“Young, comfortable and inclined toward creativity, they enjoy a utopian-seeming existence marked by strolls down tree-lined streets, carefully chosen foods and leisurely weekends spent in coffee bars, beer gardens and parks.”—
Givhan goes on to say that, in matters of style, Kagan is “unabashedly conservative,” and the piece is an attempt to convey, as Tim Gunn puts it, the semiotics of style—the idea that every part of your wardrobe says something about you.
I’m honestly glad Gunn is using the term “semiotics of style” — semiotics doesn’t get enough play on reality TV, where it might be of most use. Nonetheless, I’m sure he’d be the first to admit that the coinage is Roland Barthes’, in The Fashion System (Système de la mode), his early book on the meaning-making of women’s fashion writing.
Time — tracking and coordinating its passing, that is — is a problem that plagues every coder. At first glance, keeping track of time on a computer seems like it would be a simple matter of counting upwards, and occasionally making sure that your count doesn’t fall behind (the modern version of winding a watch). But peel back even one layer of that onion and you’ll find stunning complexity.
Many of the nastiest implications of time have been abstracted away or veiled from the everyday programmer’s view. We don’t often need to think about the fact that the contemporary definition of a second, the basic unit of time, as defined only 40 years ago, is:
The duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom.
Nor do we have to worry about the fact that the Unix timestamp, the de facto standard for programmatically marking a moment in time, bears only a tenuous relationship to the most commonly used of three “Universal Times.” And we are also thankfully shielded from the international politics that landed on the unholy acronym for this standard brand of time, “UTC,” which is natural to neither of the warring parties: French (“temps universel coordonné”) and English (“Universal Coordinated Time”). Did I mention that Unix time is happy to run backwards or forwards from time to time to meet up with its friend UTC during leap seconds?
Assuming that you’re a judicious enough programmer to use this zany representation of time correctly, translating it into the world’s 40+ time zones with a smile, adding and subtracting from it without making any mistakes, and even letting users enter it in whatever foolish form makes sense to them, you’re probably still doing it wrong in a dozen edge cases you hadn’t considered, or in the dustier corners of your code.
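To make one of those edge cases concrete, here’s a minimal Python sketch of the classic timestamp mistake: interpreting an epoch value in whatever zone the machine happens to be in, instead of pinning it to UTC first and then translating. The particular timestamp and zone name here are just illustrative examples.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# A Unix timestamp counts seconds since 1970-01-01 00:00:00 UTC
# (leap seconds excluded -- the count repeats or stalls around them).
ts = 1234567890

# Wrong: this silently interprets the timestamp in the machine's
# local zone, so the same code means different things on different boxes.
local_guess = datetime.fromtimestamp(ts)

# Right: pin the moment to UTC first, then render it per zone.
utc_moment = datetime.fromtimestamp(ts, tz=timezone.utc)
ny_moment = utc_moment.astimezone(ZoneInfo("America/New_York"))

print(utc_moment.isoformat())  # 2009-02-13T23:31:30+00:00
print(ny_moment.isoformat())   # 2009-02-13T18:31:30-05:00
```

Note that `utc_moment` and `ny_moment` name the exact same instant; only the human-facing rendering differs — which is the whole trick, and the thing the “wrong” line quietly throws away.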
The problems with time become grossly compounded when multiple computers, multiple programs, and even parts of programs on one computer are trying to get a little work done on some chunk of data. In this arena, even the insanity of the Unix timestamp cannot help us. Imagine a cluster of servers as a giant deli: when you have thousands of sandwich-makers and billions of sandwiches, doing things out of order (putting the meat on the bread before it’s been sliced) and modifying old information (trying to put the meat on a piece of bread that’s already been half-eaten) are incredibly common issues without simple solutions.
It is profoundly impractical for the many sandwich-makers to have perfectly synced micro-scale wristwatches, let alone to write this bit of time on every piece of lettuce they touch. Yet, because people expect their sandwich to come out fresh and correctly made in the end, a whole branch of active research is dedicated to devising ways to keep all those slices of bread and chefs coordinated. The most common approach in this situation is to toss the human clock out the window and instead use “logical clocks.” Taking a slice out of an onion might advance this idiosyncratic clock, which is tied only to the rhythms of the process it serves (thus “logical”), not to that fluttering cesium atom, or its even weaker numeric representations. Logical clocks track a totally subjective, yet highly interdependent conception of time — a fascinating philosophical shift, not to mention a new set of challenges for the time-weary programmer.
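The simplest member of this family is the Lamport clock, and it fits in a few lines. This is a hedged sketch — the class and the deli-flavored names are mine, not anything from the paper below — but it shows the two essential moves: a local event ticks the counter, and receiving a message jumps you ahead of the sender.

```python
class LamportClock:
    """A minimal Lamport logical clock for one process."""

    def __init__(self):
        self.time = 0

    def tick(self):
        """A local event (say, slicing an onion) advances the clock."""
        self.time += 1
        return self.time

    def send(self):
        """Stamp an outgoing message with the current logical time."""
        return self.tick()

    def receive(self, msg_time):
        """On receipt, leap ahead of the sender's stamp if necessary."""
        self.time = max(self.time, msg_time) + 1
        return self.time


# Two chefs coordinating without any wall-clock at all:
alice, bob = LamportClock(), LamportClock()
alice.tick()          # alice does local work       -> alice.time == 1
stamp = alice.send()  # alice hands off a sandwich  -> stamp == 2
bob.receive(stamp)    # bob catches up past alice   -> bob.time == 3
```

The guarantee is one-directional: if event A could have caused event B, A’s stamp is smaller. The converse doesn’t hold, which is exactly the gap that fancier schemes address.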
All of this by way of explaining my excitement at this paper [PDF], which describes a flavor of logical clock that elegantly solves both distributed sandwich-making issues (versioning and causality). It’s a pretty, shiny thing, and I hope this diatribe goes a bit towards explaining my interest in it.
[The hardcore will forgive my lazy analogy, which doesn’t quite accurately represent the difference between data dependency and causal dependency and ignores the likely presence of many, many cloned sandwiches, not to mention the frequent and unpredictable death of deli workers.]
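For a taste of what “solving versioning and causality” means in practice, here’s a sketch of version vectors, one widely used logical-clock scheme for this (not necessarily the one in the paper above; all names here are hypothetical). Each replica keeps a per-chef counter, and comparing two vectors tells you whether one version descends from the other or whether they’re concurrent — i.e., a conflict.

```python
def happened_before(a, b):
    """True if version vector a causally precedes version vector b."""
    keys = set(a) | set(b)
    return all(a.get(k, 0) <= b.get(k, 0) for k in keys) and a != b

def concurrent(a, b):
    """True if neither version saw the other: a genuine conflict."""
    return not happened_before(a, b) and not happened_before(b, a)

# Three versions of one sandwich, tracked per chef:
v1 = {"chef_a": 2, "chef_b": 0}  # chef_a's second edit
v2 = {"chef_a": 2, "chef_b": 1}  # chef_b edited after seeing v1
v3 = {"chef_a": 1, "chef_b": 2}  # chef_b worked on without chef_a's 2nd edit

print(happened_before(v1, v2))  # True: v2 descends from v1, safe to overwrite
print(concurrent(v2, v3))       # True: neither saw the other, must reconcile
```

The payoff over a single Lamport counter is that incomparability is detectable: the system can tell “stale but safe” apart from “two chefs edited the same half-eaten bread,” and only escalate the latter.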
This Richard Nash talk that’s been flying around today is a rare example of a rational and positive assessment amongst all the depressed navel-gazing that passes for discourse on contemporary publishing.
There’s a certain category of things that people like to fetishize as “sinister” but which are actually pretty neutral on the moral scale. I’m thinking of: genetically modified crops, Imelda Marcos, New Urbanism, etc.
As you would expect with a well-intentioned theory applied on a mass scale to the places we live, the manifestations of New Urbanism are a mixed bag. Purely from an aesthetic standpoint, true, the impulse to assign a moral judgment is overblown.
What makes your hair stand on end in so many of these places, though, is the sense that there is a more sinister politics behind the conversational porches and exactly prescribed understated signage. Bad New Urbanist developments feel like the kind of project that a totalitarian dictator would take up in retirement: ships in a bottle occupied by living people.
Truly sinister politics and the New Urbanist master planning wet dream come together no more perfectly than in Daybreak, Utah, a massive suburban project built atop the waste from a still-active copper mine, itself large enough to be seen from space. This article by Lucy Raven (including slideshow and creepy “Daybreak Welcome Center” interviews) has all the unsettling details.
We don’t see many or most of the display ads we see online (this post isn’t talking about search ads btw). And it’s more than just “blindness”. I’m told over and over again that the most popular Firefox extension is an ad blocker. And some of my favorite extensions, like Rapportive, blow out the ads on Gmail to make Gmail better.
As a result of all of this, ads need to get smarter. That’s just evolution at work.
I disagree. No matter how “smart” ads get, there are going to be many people who, in some or all contexts, object to them enough to ignore them or seek out solutions to disable them.
If you currently block ads, is there anything the ads themselves can improve that would make you change your mind? (I’m guessing there isn’t.)
While I agree with Marco that there are many people who will always object to ads in any form, the size of that “many” is getting smaller and smaller by the day. This is partially thanks to over-saturation, but also due to the form of advertising that Google has made ubiquitous (subtle, highly targeted, whether display or text). For example, the Fusion ads that show up in my unregistered Tweetie? I don’t particularly mind them — they’re often for things I know about or wouldn’t mind knowing about, are easy to identify, and don’t significantly impact my use of the app. Especially as the next generation — for whom these kinds of advertising are an expected, accepted, and often welcome reality — grows to become consumers, smart ads will have a place.
Moreover, the vast majority of users who currently cannot or will not put down cash (in the form of “support” or “membership” fees) to get rid of ads are the same set of users least likely to have a negative perception of advertising. There’s no value judgment here — in fact, I think uncritical complaints about advertising can be just as insipid as the ads themselves. Except for this small “elite” of users who perceive themselves as being beyond advertising, a smarter ad would be a boon.