Bundle of joy

[Questions at the end]

del.icio.us has a new “bundle” feature that lets non-exclusive categories contain tags. All it does now is break your tags up into headings, but it’s clearly intended to provide a second level of hierarchy as users’ tag lists sprawl beyond manageability. At first I was freaked out because I wasn’t sure whether I wanted to further categorize my tags. The nice thing about del.icio.us is the way it encourages you to make a thousand eccentrically tagged piles and not worry about careful taxonomy. Each tag has a meaning for you, and when you want to find something again you look at the list and back-trail just one of your several associations. But then it’s social software, and so you take into consideration both the intelligibility of your tagging for other people and, possibly, their respect for your ordering scheme. (Certainly, one avoids telling associations. See, for instance, someone’s “funny” tag.)

I found with the bundling that I was self-conscious about making stupid general categories. For instance, I tend to throw everything “humanistic” into [literature/philosophy/art], because I have trouble separating out, say, critical approaches to works of art. But there’s something else that I consider [culture] that involves pop art, current events and light news stories (but not [politics]) and sociology. The other thing to keep in mind is that the bundles are non-exclusive, but that they contain tags, i.e., sloppy on-the-fly categories, mostly chosen based upon associable attributes. Before I started writing this entry, I was chirping to myself, “It’s non-exclusive—no worries!” and even mused that the law of noncontradiction applies only to things in the same time, place and respect, so I was okay. Now, I’m upset. My innocent, free-form, self-expressive tagging activity has been opened up to all this heavy stuff about the right way to chop up being.

Take, for instance, “shirt”. There’s this site that showcases different t-shirts. Right now, I put the tag “shirt” in [information] with my “apartment” tag. Now, if I decided that t-shirts were legitimately a form of art, then I would have no problem adding the tag “shirt” to the [literature/philosophy/art] bundle as well. What doesn’t make sense about this, though, is that any given item tagged “shirt” is usually going to be of either informational or artistic value, not both! So if you click on the [information] bundle and find shirt items in it, you don’t know whether they have anything to do with shirt qua information…
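(For the technically inclined: here is a minimal sketch, in Python with invented tags and bookmarks, of how I understand the bundle mechanics. A bundle is just a named, non-exclusive set of tags, and viewing a bundle unions its tags’ items, so nothing records why any item qualified.)

    # A toy model of del.icio.us-style tags and bundles.
    # All tag names and bookmarks here are invented.

    # Each tag maps to the set of bookmarked URLs carrying it.
    tags = {
        "shirt":     {"tshirt-showcase.example.com"},
        "apartment": {"rental-listings.example.com"},
        "painting":  {"gallery.example.com"},
    }

    # A bundle is a non-exclusive set of tags; the same tag
    # may sit in several bundles at once.
    bundles = {
        "information": {"shirt", "apartment"},
        "literature/philosophy/art": {"shirt", "painting"},
    }

    def items_in_bundle(name):
        # Union of all items under the bundle's tags. The result
        # can't say WHY an item belongs (shirt qua information
        # vs. shirt qua art).
        return set().union(*(tags[t] for t in bundles[name]))

    print(items_in_bundle("information"))
    # {'tshirt-showcase.example.com', 'rental-listings.example.com'}
    # (set order may vary)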

This problem is the reason why no librarian would ever use a del.icio.us-esque system for categorizing the elements in an inventory. Actually, I think that tags tend to run backwards from normal taxonomy, because they’re not essentialist (e.g., biological taxonomy) but pull together disparate things by aspects. I think we choose the tags we use according to guesses about the relevance of the set of items that will share a given aspect, and I know I tend to discriminate against things that would make the contents of a tag too divergent, and make a new tag instead.

We do the same thing on Google: we pick the words (aspects) that we think will be included in the type (essence) of page we’re looking for. (This is induction, right?) Here’s a question: do we actually have the concept of the page in our heads before we find what we want, or is this mostly a story we tell ourselves after the fact? I think the latter is the current (computerized) information science perspective; also, I have heard librarians complain about Google-style relevance searches applied to their Dewey-categorized catalogs. What the new IS people say is that old-school library science is tied to systems of categorization for physical elements, e.g., books, which actually have to reside in one place on a shelf (unless the library wants to keep multiple copies). So when the new IS people get excited, they say that the same information can reside in many “places” and can be represented at the same time to many people in different ways. That’s why del.icio.us-style tagging and relevance searching win out: it’s easier to point to the thing you’re looking for in the location where you know to look for it than to figure out deductively its location in a hierarchical system, especially if that system is impossibly big.
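(To make the contrast concrete, a sketch in Python with invented data: the hierarchical system demands that you deduce a full path, while the tag-style reverse index only asks you to remember one aspect you actually associated with the item.)

    # Two ways to find the same document (all data invented).

    # Old-school hierarchy: you must reconstruct the full path.
    hierarchy = {
        "arts": {
            "literature": {
                "criticism": ["essay-on-conrad.example.com"],
            },
        },
    }
    doc = hierarchy["arts"]["literature"]["criticism"][0]

    # Tag-style reverse (inverted) index: point at an aspect,
    # get back every item that shares it.
    index = {
        "conrad":    {"essay-on-conrad.example.com"},
        "criticism": {"essay-on-conrad.example.com", "review.example.com"},
    }
    docs = index["conrad"]  # one remembered association suffices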

Except that physicality and distinctness keep reintruding. The W3C treats URIs (like http://www.stupididea.com/method/) as distinct locales that are always just one thing, at the same time and in the same respect (e.g., CGI inputs). For another thing, information always occupies a physical location, usually in the bits on a hard drive, although multiple identical instantiations of an item of information can exist simultaneously in different places. Since each instantiation of an item of information has the potential to be altered, the W3C uses URIs to establish the unique identities of authoritative documents. Further, while most documents are texts that allow for a wide degree of interpretation of intent and the nature of their contents, many documents on the web are XML-like data files that rigidly describe a set of contents defined in relation to explicit definition files. These documents carry their own intended meaning, simply a set of hierarchical and associational relations between values (for example, a Friend-of-a-Friend (FOAF) file binds together values of people’s names, addresses, etc. into a network of relations).
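(For flavor, here is a rough sketch in Python rather than actual RDF/XML; the property names echo the real FOAF vocabulary (foaf:name, foaf:knows, foaf:mbox), but the people and addresses are invented.)

    # A FOAF-like network as subject-property-value triples.
    # Property names follow the FOAF vocabulary; data is invented.
    triples = [
        ("#me",     "foaf:name",  "Method"),
        ("#me",     "foaf:mbox",  "mailto:method@example.com"),
        ("#me",     "foaf:knows", "#hayden"),
        ("#hayden", "foaf:name",  "Hayden B."),
    ]

    # The file carries its meaning as explicit relations: asking
    # whom "#me" knows has a determinate answer.
    known = [o for (s, p, o) in triples if s == "#me" and p == "foaf:knows"]
    print(known)  # ['#hayden']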

Noncontradiction holds for information as well, if one takes “in the same respect” to mean something like “in the same system of categorization”. Moreover, relevance search and tagging, i.e., reverse look-ups based on aspects, approach essential, hierarchical categorization as more terms or more tags are used to single out an item. In Google, you should find only one document (and its copies) once you reach the absurdity of searching for every word in that document in sequence. A del.icio.us bookmark list will be trash if every item is tagged with every tag, and bundling will only provide more information rather than less if tags are chosen to work rationally together with a carefully distinguished set of bundles. (I might be misunderstanding something, though. These bundles confuse me.)
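(Again in Python, with invented data: each added tag intersects away more items, and with enough tags you converge on a single item, which is just the address a finished hierarchical classification would have handed you directly.)

    # Intersecting aspects converges on a unique item.
    # All tags and items are invented.
    index = {
        "art":   {"a", "b", "c", "d"},
        "shirt": {"b", "c"},
        "funny": {"c", "e"},
    }

    results = None
    for tag in ["art", "shirt", "funny"]:
        results = index[tag] if results is None else results & index[tag]
        print(tag, "->", sorted(results))
    # art -> ['a', 'b', 'c', 'd']
    # shirt -> ['b', 'c']
    # funny -> ['c']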

For example, I had a tag named dog that was neither [people] nor [information] nor [culture], so I put it in its own bundle named [miscellaneous]. This is real lame, so I considered anticipating future tagging needs by creating an [animal] bundle, set against [people]. I realized how funny it was that in my supposedly informal categorization activity, I was being pushed toward established classification schemes (if I have [people] and [animals] I should have [places] and all sorts of things like [rocks] and [concepts]). Of course, I’m just imposing my prejudice for philosophically justified hierarchies onto a system of aspect-based tagging that is intended to provide a quick and dirty solution to mass organization of materials, right?

Okay, I’m going to stop now. Sorry for the incoherence and bad writing. My questions are:

  • (asked by Mr. David to Amy T) “Is there more than one way to divide up being?”
  • Does the need to reconcile the categorization of a large number of items in a system force specificity and thus truthfulness, or does it strain rationality, or is it just a perverse way to play a game of limited interest?
  • How should I organize my del.icio.us using bundles?
  • What is the being of a combined tag (like blue+car)?
  • Should we force computers to make inferences based on philosophically rationalized datasets? Does this make sense, and is it even possible?

Stop the presses

Update: hb and Robbie P say it isn’t so.

Like a thunderbolt out of the blue, it hits me: Not so much, as in “Is Thomas Friedman a genius? Eh…not so much,” is a new joke meme! The joke is that you use an English phrase indicating a reduction in degree when discussing a matter where it’s commonly assumed that there is no possibility of variation in degree, only in kind. I think this meme has been perpetrated by Jon Stewart, who uses it as a Jewish joke about excessive equivocation, where the speaker has an obvious, unequivocal opinion. I can’t remember the exact material, but it’s something like, “Do I think he’s a douchebag? Eh. ‘Perhaps’. Do I think he’s a ‘decent person’? Eh. Not so much.”

(This makes sense, too, because I’m thinking in particular of the Jon Stewart show immediately after his Crossfire appearance, which many people would have watched and remembered.)

Here is the example that alerted me. The idea here is that the ugly roller mice do not appear to be ergonomic at all. It is my contention that this man, this blogger, has watched Jon Stewart and, consciously or unconsciously, picked up the “not so much” joke meme.

Further, I was myself aware of the joke meme because I have exhibited a tendency to use this joke, even though I don’t regularly watch Jon Stewart (although I did watch the post-Crossfire show). I now believe that I got it indirectly from Hayden B., an avid Stewart viewer. I have in turn transmitted it, although perhaps in a weaker concentration, to Amy T. This indicates that “not so much” is a joke meme, rather than simply a habituated imitation of Jon Stewart, because otherwise only direct, regular viewers would be affected. The etiology seems to be that certain primary carriers, like Hayden and perhaps the blogger, contracted the meme directly from the media source and then transmitted it to secondary carriers with a basic receptivity. The all-important stage in the proliferation is the passage of the meme to the tertiary carriers, who may have never directly witnessed a Stewart “not so much” incident.

My thesis, then, is that “Not so much” is a linguistic and humoristic meme operating at such a basic level in our language and transmitted to such a circumscribed crowd that its emergence and proliferation has passed virtually unnoticed. Until now.

Moreover, I will venture the following conjectures:

  1. The overall extent of transmission will be limited by the relatively small population of people with receptivity to the humor, and by the insularity of the subpopulations of same.
  2. The relative weakness of the infection in tertiary carriers will prevent further transmission, thus linking the continued viability of the meme to the persistence/humoristic promiscuity of the primary carriers.
  3. Primary carriers are likely to be “media junkies” with high susceptibility to memes in general; thus, more recently acquired memes will tend to edge out older memes.
  4. Eventually, the primary carriers will forget about the “not so much” joke meme, and the secondary carriers will be disappointed to find their attempts at hip humor rebuffed as “lame”.
  5. The meme may have reached its peak exposure and may already be in the process of receding and disappearing entirely.
  6. Jon Stewart’s continued media presence may have the ability to periodically resurrect the meme for short bursts of renewed activity.
  7. A careful linguistic epidemiological study of newspapers/magazines and the web of 2004–2005 might reveal a pronounced spike in incongruous usages of “not so much” that would support the Jon Stewart “Not So Much” Joke Meme Hypothesis.

Am I employed yet? Eh. Not so much.

Another Letter.

This is from an email I sent around. Thought I’d try one more time.

Dear friends and acquaintances,

I own and pay for a vastly underutilized web domain called stupididea.com. It has resources to host many more images, a tremendous amount of text, perhaps some video. The following is the SHORT PITCH:

You may find of interest the following resources:

THE LONG PITCH:

Stupididea.com’s original concept centered on the wiki at
http://www.stupididea.com/mediawiki/ (running the same software as Wikipedia), where the plan was to accumulate “stupid ideas” and, through debate and collaborative editing, improve them to the point of being standalone white-paper proposals. Later, I found that this concept has been well implemented at halfbakery.com and shouldexist.org. I still work on my own stupid ideas, and invite others to do so, but I would like to open up the wiki to any kind of collaborative project/publication. For those who don’t know, the first principle of wiki publication is “be bold”, meaning don’t bother to ask permission first; just be considerate of other people’s work.

Everyone on this initial mailing is a Johnny, and the intention is for this to be a low-volume site, focusing on the specialized, esoteric, cult-like discourses and interests we all know and love. Making the wiki a more interesting place to visit would be a good use of the bandwidth, but I am eager to provide other services that people would find useful or gratifying. In particular, I can set you up with one of those weblogs you all have been hearing so much about (I can also install any open source web software that requires PHP and MySQL—search freshmeat.net). Some other ideas:

  • A listserv with a circumscribed, non-mailbox-clogging purpose (for example, as an expert knowledge query server: “Hey, what exactly is the deal with Boethius?”, “What things should I lie about on a law school application?”)
  • A bulletin board (forum)
  • A webzine publication (my suggestion: “The New Good: Culture and Ethics”)
  • An open-access photo gallery (à la http://www.flickr.com)
  • Your idea here
—[Method]

P.S. The only area where I intend to exercise a heavy editorial hand is in disallowing certain subjects that are amply served elsewhere on the Internet. This relates specifically to anime (but not to g-novels).

I should be more explicit: if people would like their own web space (under a cool domain name, not you.bloggoober.com), would like to serve up their own media files (within reason), or want space to start a project, just contact me at my email at the bottom of the page. I’ll give you FTP access and will help out with technical things.

Defeated!

So I said that I wasn’t sure, and now I remember why.

  • “you can’t selectively historicize.” -bill

[You can’t selectively historicize] if you’re going to approach a wide range of thought. And since it seems best to use history to give context to thought, rather than to explain thought, and since it seems wise to approach thought and history each on its own terms, it is “safe” to largely ignore history for the duration of a 4-year education. Some people will take the wrong cue, I think, and conclude that a study of history can be dispensed with entirely, since it seems superficially that thought can be guided solely by supposedly universal considerations, but whatever. They grow up or they don’t.

  • “I’m glad I’ve met at least one person who was an open monarchist, instead of all the crypto-monarchists” -bill

Further, as rb points out, it may be a necessary condition for rigorous thinking about democracy and politics that one flirt with monarchism and antidemocratic thought for a time. I do think that you eventually have to learn to love democracy out of a respect for justice, rather than out of a rather coldhearted acquiescence to historical forces. Also, as bill points out, intellectual honesty in discussion and a true, rather than ironic, commitment to democracy seem to go together.

  • “it’s best for the freed minds to return to their own time outside of St. John’s classes” -hb

On the wisdom of this, I might be convinced that the senior year program overreaches, although I’ll have you know I like Wittgenstein. Actually, I’m not sure that we get that much out of Heidegger…

  • “‘monarchism’ at St. John’s is a non-entity” -rp

[Monarchism at St. John’s is a non-entity] as a substantial sect, although I believe that anyone from the outside would perceive a, like, patina of antidemocratic sentiment, especially toward the end of senior year, that would have to be described as something like monarchism. I should also mention that the idea of “monarchism” as a coherent political position in America amuses me, which is why I keep bringing it up.

  • “Johnnies leave about as heterogeneous as they come in” -rb

Your point is well-taken, especially about objectivists and Mabel. It could even be argued that the degree of genuine intellectual diversity, as compared to some imagined university, is exceptional. It spans a political spectrum, includes religionists and atheists, and folds in various subcultures. Yeah, amongst the melee kids, the stoners, the ravers, the hiphoppers, the waltzers, the cape people, the gym rats, the math kids, the objectivists, the environmentalists, the Protestants, the Catholics, etcetera, etcetera, there are some dyed-in-the-wool monarchists, no joke, but again, whatever.

More, later.

Note.

One of these days, I want to write something about intellectualism and madness, particularly paranoia. Some initial points:

  • The phantasmal nature of intellectual abstraction (how in proposing to give an account of the absolute, intellect introduces something otherworldly and with a life of its own into the world).
  • The paradox that suspiciously simple abstractions (right, law, soul, sin, cause, evil) cannot be shaken because of their utility and naturalness while efforts at giving more concrete accounts (through various characterizations of the so-called absolute) result in seemingly bizarre concepts and ever-more difficult prose, thus reducing the practical utility and, somehow, worldliness, of the supposedly concrete accounts. See, for example, Being and Nothingness as an attempt to express an astonishingly commonsensical proposition in just about the most convoluted and dizzying way possible.
  • How the pursuit of the “concrete abstraction” (or the dream of complete conceptual representation) led, in the 20th century, to various forms of aphorism, parable and symbolic speech intended to influence intellectual thought. How the resultant difficulties in interpretation created confusion for intellectuals split between their inability to directly assess the arguments in texts and the difficulty of choosing between schools of thought responsible for presenting modes of interpretation.
  • The paradox restated: how the direct utility of theory decreased in inverse proportion to its ability to describe the fullness of ethical and social realities (e.g., Foucault’s crystalline matrices).
  • The unhealthy habits of intellectuals (reading, isolation, writing and perhaps stimulated inwardness).
  • The problem of knowledge complexity generally.
  • Let me stop here. Is anyone interested in this, or is it boooring?

A Letter

I’m not going to send this, because I’m no longer sure about it. And no, I don’t normally think about this. As should be clear from previous entries, I spend most of my time thinking about apocalypse, robots and, of course, robotic apocalypse.

Dear Gadfly,

As an [alumnus], I would normally be too self-conscious of my role as a has-been to write a letter to the student-tutor publication of the school. But it suddenly occurred to me (really!) while writing a philosophical genealogy for my own use (really!) that there is an alarming, and possibly damning, hole (really a cap-stone) in the program. In the senior year, the program ventures into what are termed Modern (as opposed to Enlightenment, Scholastic and Classical) and (depending on labelling criteria) Postmodern philosophy. Most notably, we now read Wittgenstein and Heidegger, after having read Freud, Nietzsche and Conrad. It seems to me that there is a huge elephant in the room when we read both Heidegger and Wittgenstein, and it has something to do with the intervening years between them and the latter group.

I hope that it will not be considered mere historicism to suggest that we may need to understand something about this period to understand Wittgenstein’s relationship to language and to idealism. And I would be the last to condemn Heidegger’s thought solely on account of his actions, but shouldn’t we ask ourselves what extra information we might need in order to evaluate his doctrine of “attunement” to the “beings of being”?

Of course, I am wondering why we don’t have a reading that relates directly to World War II or the social and philosophical conditions that led to Nazism in Germany. What just now occurred to me was that the thought of the second half of the 20th century, which we are surely trained to consider (as it is the thought of our time), will not make sense unless we are presented with the war that has stood, for so many, as the proof of a somber thesis about Western Civilization and the nature and desirability of absolute truth. As always, we would not be required to reach any prescribed conclusion, nor would it be possible to force a senior to conclude anything. But to not present the War is, simply, to paint a false picture of the history of Western civilization and thought.

Sincerely,
[Method]