
Thursday, May 29, 2014

Selfies

The text from my presentation at LACMA in April 2014. From a book chapter in progress on selfies as life-writing.

----
Here's the TL;DR of what I want to address very briefly here: first, selfies considered through the lens of Roland Barthes' notion of the punctum; second, the development of formulaic and conventional modes of self-representation in selfies, considered as studium; third, the social disciplining of selfies as visual rhetoric and communication practice: the #Selfie song, selfies at funerals, SkinneePix, the striation of services. This is to suggest a number of avenues into the topic that I am exploring at length elsewhere.

The Ineffable: Selfies as Punctum 

In Camera Lucida, French semiotician and poststructuralist Roland Barthes turns his back on the fields he helped inaugurate, and instead considers the ineffable essence of photography, what he calls the punctum. The punctum, literally, is that which pierces. Barthes begins from the problem of trying to understand why some photographs of his mother move him and some don't. He finds one of her as a child ("The Winter Garden Photograph"). It moves him, it has the punctum, it is the essence of her. Its meaning inheres not in the photo itself but in his own personal, subjective relation to it. He writes, in a long, characteristic parenthesis: "(I cannot reproduce the Winter Garden Photograph. It exists only for me. For you, it would be nothing but an indifferent picture, one of the thousand manifestations of the 'ordinary'; it cannot in any way constitute the visible object of a science; it cannot establish an objectivity, in the positive sense of the term; at most it would interest your studium: period, clothes, photogeny; but in it, for you, no wound)" (Barthes 73). Probing the wound inflicted by this photo, Barthes decides to abandon his formalism: "I had understood that henceforth I must interrogate the evidence of Photography, not from the viewpoint of pleasure, but in relation to what we romantically call love and death."

This reminds me of Dear Photograph. Those photos are about the very act of undertaking this process that Barthes describes--they're layered self-portraits of love and death. From this site, we can see that some of what enrages us about social selfies, perhaps, is the mismatch between their deeply personal meaning and their broader irrelevance when broadcast.


The Dear Photograph images are uncanny and insipid at the same time in ways that are unavoidable but instructive. There is something generalizably and shareably jarring in the forced layering of past and present that is the generic unifying element of all these photos. That's the uncanny part. There's a studium element at play here that's really effective at drawing attention: the trick and the convention of seamless layering, the closer the fit the more jarring the effect. The punctum, though, is personal: it's what the people in the first photo mean to the owner of the thumb in the picture. The life flashing before the eyes of the one person for whom it is meaningful. This is probably why the captions strike us as so terrible: mawkish, sentimentalized, like what you'd see in a greeting card inscription. Often the inscriptions/authors are talking to the subject of the photograph directly ("Mom, I want you to know that we turned out all right") rather than to the actual audience of the shared and staged photo, which would be the internet at large. Dear Photograph is all about the punctum rather than the studium. And the punctum is meaningful only to the individual who sees it: the Dear Photograph selfie is a representation of the moment of piercing; it shares what cannot be shared. It is compelling, and impossible.

 The Mannered: Selfies as Studium 

Of course, let's admit it's often hard to see where the opportunity for punctum might exist in what feels like an endless parade of interchangeable overlit facial downshots and bathroom mirror moues.

This is the studium of the selfie, the more deliberate and formal characteristics of the photographic image. I think we are learning new ways of arranging our bodies to be photographed, and deciphering these arrangements as critical viewers of photography, rather than personal subjects seeking out the connection to friends or family depicted in images. In interesting ways, "selfies" are becoming "professionalized" in the kinds of self-consciousness and artfulness that subjects are bringing to posing and staging, but there's a really limited and possibly limiting kind of self-expression that results. Duckface and the bicep pop are two well-known examples of self-consciousness in front of and behind the camera. Selfies are developing a kind of quick-start grammar that arranges poses and situations in formulaic ways, in a move I find analogous to the way that Instagram's set of filters offers non-experts a quick-start way to achieve professional printing effects. It is rough and ready, and it is functional.

Historic selfies seem like questioning selfies: people are looking in mirrors, pointing their cameras, looking at their reflections. They look like they're asking something: do you see me the way I see me? Is this who I am? We look different in selfies than in the photos others take of us, in which we are posed and staged according to someone else's aesthetic, or according to the logic of the occasion. In a photographic portrait, the artist has the agency, not the sitter, really. Self-portraits are different. We direct the shots, to the limits of our technical capacity: it's not a snapshot, exactly, because it's staged; but it's not a formal portrait either, because the only audience we're seeking to please is our own self. The selfie is an autobiographical act, rather than a biographical one.

And then smartphones.

Of course, photographic skills and creative vision are not equally distributed among the population and it should be unsurprising that most people who take selfies are not well-versed in the qualities of morning or evening light, the notion of distortion and lens length, depth of field, or shot angle. And yet, they do know how to scrutinize themselves, and they consume massive amounts of celebrity and popular culture images daily. People try to reproduce the images they see, the ones they find attractive: they discern a formula, they try to follow it. The ubiquity of smartphones leads to a whole new vernacular language of self-photography: because these photos are by their very nature shareable, norms and standards develop very quickly through mass distribution and mass social filtering--"like" or not. This, arguably, squashes the conversation between self and camera that prompts the selfie: less a questing desire to capture the essence, and more adherence to convention, to demonstrate belonging.

Some of the formula has to do with the technology: rear-facing cameras lead to the bathroom mirror aesthetic. Front-facing cameras give us foreshortened arms and the down-shot and the close-crop.

Some of the formula derives from prior formulaic photographic genres. In particular, celebrity red carpet poses. Girl group shots look a lot like red carpet setups now. You will no doubt recognize the part-sideways stance, the one leg thrust forward and bent a little, the forward chest and the bent arms. Maybe we can talk later about the codes of conventional attractiveness this indicates, and the desire to always look thinner. A porn aesthetic is also manifest in a lot of selfies: too-tiny clothes or cheap lingerie, grabbing one's own boobs or butt, looking over the shoulder. In this mode, there's a lot of T+A. The more upmarket version of this aims to replicate more of a Vanity Fair aesthetic, half-open mouth, mussed hair, narrow depth of field. The hipster version is the ironic half-face in the photo, the tilted angle, the weirdo filter: this mode is more or less equally available to men or women. There is no one kind of selfie, but rather representational codes and conventionalized expressions that are legible and normative within communities of practice.



These are codes of self-presentation, stereotyped, right? There’s a formula, and learning it can compensate quickly and easily for lack of fluency or training in the photographic medium. Doing it well indicates and reinforces social belonging, rather than aiming to capture the uniqueness of a given self. Selfies, then, might be less a mode of static self-portrait, and more a dynamic communicative practice.

Social Discipline and Selfies 

Selfies, I’m suggesting, are communicative acts: they are of me, but they aim to speak to others. As such, they are social. As they are social, they are subject to norming and to discipline.

Skill at selfies is not about being a good photographer, it's about being a good consumer of images of others, of learning how to discipline your subjectivity into preset representational moulds. This would be contra Barthes, who describes instead "the impossible science of the unique being"--but no one really wants to present that, or to see it. That would be too raw, and the audience is much too broad for this kind of intimacy. We need workaday photos, photos that don't sear the soul with their unexpected justesse.

The body can be disciplined through apps if not in real life. Generally, it should be thinner. SkinneePix is a kind of Photoshop Liquify macro: "pounds lighter" is the unit of measurement for the distortion it wreaks on your depicted face.


And it only works if you take the right kind of selfie. No profile. No half-face. Just full mug shot.
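To make the mechanics concrete, here is a minimal, hypothetical sketch of the kind of naive distortion I'm describing, in Python with the Pillow imaging library. This is emphatically not SkinneePix's actual (proprietary) algorithm; the function name, the band width, and the squeeze factor are all illustrative assumptions. It simply compresses the central vertical band of the frame, which also suggests why a filter like this would insist on a centred, full-frontal face.

```python
# Hypothetical sketch only: a naive "slimming" filter that squeezes the middle
# of the frame, where a full-frontal selfie face sits. Not SkinneePix's algorithm.
from PIL import Image

def slim_centre(img, squeeze=0.85, band=0.5):
    """Compress the central vertical band of `img` horizontally by `squeeze`,
    leave the left and right edges untouched, and stitch the strips back together."""
    w, h = img.size
    band_w = int(w * band)            # width of the band to be squeezed
    left_w = (w - band_w) // 2        # width of the untouched left strip
    left = img.crop((0, 0, left_w, h))
    centre = img.crop((left_w, 0, left_w + band_w, h))
    right = img.crop((left_w + band_w, 0, w, h))
    new_band_w = max(1, int(band_w * squeeze))   # the "pounds lighter" knob
    centre = centre.resize((new_band_w, h))
    out = Image.new(img.mode, (left_w + new_band_w + right.width, h))
    out.paste(left, (0, 0))
    out.paste(centre, (left_w, 0))
    out.paste(right, (left_w + new_band_w, 0))
    return out

# e.g. slim_centre(Image.open("selfie.jpg"), squeeze=0.85).save("selfie_slimmed.jpg")
```

The point of the sketch is not the code but the assumption baked into it: the face has to be where the formula expects it to be.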


Social practices are disciplined through social media and mainstream media culture. Most savvy users now know that Facebook tolerates selfies only very rarely, and mostly as profile pics. Instagram is more selfie-friendly, but these selfies must be aestheticized: Instagram selfies are pretty. Snapchat is a selfies free-for-all: you can let it all hang out, and not just your boobs, but also your really goofy faces, and all the rest of it.



Too many selfies on Facebook and you're a narcissist getting downvoted by your friends. The right kind of selfie on Instagram looks beautiful but not too forced: and don't get caught Photoshopping yourself! And I'm really fascinated by the "#Selfie" song, by The Chainsmokers. It cashes in on, and mocks, and disciplines contemporary culture: in the song, a valley-accented club kid takes and talks about selfies, and uses them to communicate with others as well as to preen and seek ego validation. Like "Valley Girl" in the 1980s, it codifies and crystallizes a cultural moment and a set of in-group behaviours and language, to both celebrate and belittle. Just this morning, Susan Bright sent me a link to "Elders React to #Selfie," which is more social sanctioning: it ramps up the judgment, and the in-group status of youth, by leaving older people completely mystified. As the song says, "Oh my god, for sure, for sure."

Selfies constitute a visual rhetoric, a kind of picture-speech that in some cases becomes rapidly conventionalized and stifling (like Instagram) and in other cases opens a whole new realm for creative self-expression (like Snapchat). But in either case, taking a selfie requires, if not photographic or artistic skill, then social skills related to peer-group norms and conventions. The selfie is contextual. It is communicative, and it is subject to norming and discipline. It is fraught, and it is consequential. So let's have no more of this, please.


Monday, May 27, 2013

A New Companion to Digital Humanities, documented!

There's a new edition of the awesome Blackwell Companion to Digital Humanities in the works! Hooray! In early April, I was pleased and honoured to receive an email from co-editor Ray Siemens asking if I would like to contribute a chapter on "Mashups, Remix, and Reuse."

Hell yes, I would, axshully.

I thought I would document the process here.
  • Step one: email back right away and say yes.
  • Step two: miss the first deadline inadvertently by writing the wrong date on my calendar.
The not-incorrectly-written-down deadlines look like this:
  • 20 April 2013: Responses to invitations received by editors
  • 1 May 2013: Authors submit 250 word chapter abstracts, titles, and 100 word biographical statements to the editors 
  • 1 December 2013: Authors submit chapter drafts to the editors 
  • 1 February 2014: Editors provide comments on the chapters, and make all chapter drafts available for all collection authors to peruse (for purposes of internal citation and cross-referencing) 
  • 1 May 2014: Authors submit final drafts of chapters to the editors, which are then forwarded to the press.
I thought it might be interesting for my graduate students and others to follow along what it looks like to go from 0 to 8000 words in seven months, and from invitation to in-press in one year.

So I've already sent in my response to the invitation, saying yes.

And I've sent in my abstract now. The trick with writing an abstract before you've written the piece in question is to understand that you will get to change it later if you need to. Unless a piece is already written, it is not actually possible to abstract its contents into a capsule summary. But proceed as though you can! Also, it's important to hit the word count. What I like about the abstract I've managed to write is that the main point--the gist of the whole chapter--is in the first two sentences. I had to rewrite for several hours to get it to sound as "well, duh" as it does right now. It takes me about 5 minutes to brute-force type 250 words; it took me about 2 hours to write this abstract and another 15 minutes to get the bio right. Short writing is often harder than longer writing. Go figure. </unsolicited_advice>

Here's what I wrote:

Mashups, Remix, Reuse: New Strategies for Open Scholarship

Remix culture is a popular phenomenon rooted in fandom, crafting new texts from existing cultural artifacts. This chapter considers the deployment of similar creative-reuse techniques in academic research.

Popular culture is increasingly pervaded with remixed book, television, film, music, and native web content destined for new audiences and new contexts. Jedi Knights and Hogwarts students fight Voldemort together in fan fiction. Episodes of My Little Pony are recut and re-scored to turn a story of the magic of friendship into a murderous thriller on YouTube. 80s synth pop is layered against contemporary hip hop to underscore shared beats or to highlight diverging preoccupations, circulating through peer-to-peer networks. Remixes deploy the full range of Internet media to allow fan-producers and fan-consumers to make original content their own.

Academic remixing employs a similar ethic to produce somewhat different kinds of texts that nevertheless, I argue, retain a bit of the joy and sometimes the informality of popular practices. Data visualization techniques like Wordles, open licensing schemes that permit data mining or other repurposing, and the reframing of academic research for media like podcasts or web videos are instances of academic remix strategies. Remix culture in digital humanities and beyond builds on the modular content models of web 2.0, the emerging ethical movement toward open scholarship, and an increasing imperative to actively engage a greater variety and range of audiences, to create new modes of research creation, community engagement, and knowledge dissemination.

(abstract: 245 words; proposed chapter: 8000 words)

Bio 

Aimée Morrison is an associate professor in the Department of English Language and Literature at the University of Waterloo, where she teaches new media theory and practice. She contributed the chapter “Blogs and Blogging: Texts and Practice” to the Companion to Digital Literary Studies, and teaches a yearly course in multimedia and the social web at the Digital Humanities Summer Institute at the University of Victoria. Her current research on Deciphering Digital Life Writing brings auto/biography studies and new media theory to bear on popular web-native life-writing texts, including Facebook, personal mommy blogs, and photo-a-day projects.

(bio: 101 words)

Is it perfect? No. Am I going to change it? Almost certainly. But now there's 250 words of text in a file marked "DH Book Chapter" and it gives me somewhere to start, in addition to giving the volume editors the info they need to provide to the press to secure reviewers. Win!

Wednesday, March 6, 2013

Academic / Online: All the links that don't work in print

Hello, readers! On March 4, I visited the College of Arts at the University of Guelph, and gave a talk on the topic of "Academic/Online" in their Digital Humanities series. Many thanks to Kathy Hanneson for inviting me!

During that talk I referenced a lot of web sites, and showed them, too, but I thought having a page with all the links might be useful. This is that page.

In the talk, I argued that to be “academic/online” is to push forward on two different fronts.
  1. You need to develop and cultivate a persona that you send into the Internet to act on your behalf in all your scholarly travels. It has to represent you accurately and effectively. And also, we can hope, efficiently. So we’ll talk about that, about being an academic online, in the noun sense (“I am an academic, who has a presence online”). 
  2. Moving beyond the individual, we will also consider what it means to move the academic online, in the adjective sense (“Let’s get my academic research out there, on the intertubes!”). Academic/online in this sense can encompass everything from whole libraries moving online, to workshops or conferences maintaining a hashtag backchannel or putting abstracts online, to new forms of hybrid publication blending the heft of peer review with the utility of online distribution. 
The introductory and concluding portions of the talk did the analytic work: considering the bigger picture implications of these shifts in practice and what they mean for the future of academic work in both idealistic and pragmatic ways. This material is not included here--here it's just links to all the great examples of academic/online that I discussed.

An Academic Online

My own personal footprint as an individual academic / online:
Here are two blog posts by Rohan Maitzen, that I mentioned, which defend blogging as a curiosity-driven intellectual practice and link this to real-world practices of dabbling and discovering, or not:
Here's a ProfHacker post by Ryan Cordell I didn't mention, but I should have, on becoming an academic/online--this post links out to many great resources online:

---

The Academic Online

Academic Projects Online

I opened with the example of the multi-pronged project online that is the Quilt Alliance and all its various permutations in different communities and contexts. Here's a sampler, if you will pardon the pun:
These are the academic-based research projects I referenced in the talk:

Academic Publishing Online

Here are some links related to online journals that look more like journals, and less like online :-)
Here are some new kinds and ways of periodical publishing on the web:
How about books? These are going online in new ways, too.

Tuesday, February 5, 2013

New media studies and self disclosure: some questions

Something I’ve noticed from my first New Media Genres graduate seminar this year: in the class introductions there was an abundance of really personal information—informal, private, embarrassing. There were identity disclosures related to sexuality. There were confessions. There were fans. There were passionate doubters. Some polemicizing. I made a joke about how everyone really needed group therapy more than a graduate seminar.

But the joke reveals something important, I think – just like when Linda Warley and I were teaching the Writing the Self Online graduate course, I discover that (like studying auto/biography) studying new media, because it involves using / doing / creating new media, both produces passion as a byproduct of research and then compels the disclosure of this passion in the work. Mommy blogs. Antonio Banderas fan fiction. Pottermore. Anti-Twilight memes. Gay dating / hookup sites. Bikini photos and Google searches that never go away. Due South. LOLcats. Food porn.

Are researchers in a/b studies a little prone to incorporating their own a/b into their work? Yup. It’s probably because they are hyper-aware of the grounding of most narratives in the self, and the grounding of the self, provisionally always, in shifting and competing narratives, overlapping with and constituting our shared histories. I have noticed that work in this field often blends the practice of life writing with the theorizing of same.

Are researchers in new media studies a little prone to informality, shorthand, fannish expression, multimodal communication practices? And do they root a lot of their work in their own experiences and thus seem to write a lot about themselves? Yeah. It's true in my own work on personal mommy blogging for example--and the piece I've linked, tellingly, found its home in the journal Biography.

Is this a bug, or is it a feature?

I mean, when suchlike happens in a literary classroom it is called the Oprah’s Book Club problem: I don’t care if this novel brought up some long-simmering issue from your childhood and you couldn’t bring yourself to read the ending because the middle made you cry for three days. We may want to study the means by which emotional reactions are brought about. We may want to study some audience effects on a larger scale (Janice Radway and the reading of Reading the Romance) but mostly, that audience is not understood to be the critic doing the critiquing. Identity politics is for cultural studies, not usually for literary reading. (I’m exaggerating, but this is by and large conventional, if not everywhere or in all circumstances or niche endeavours.)

So new media researchers housed in English may well be at some kind of disadvantage because they have to be so personal and so Oprah’s-Book-Club in their own work: it breaches disciplinary standards of critical distance, the focus on form. This is again similar to some of the difficulties faced by auto/biography scholars, at least historically: there is assumed to be no art in the recitation of the facts of a life—nothing to interpret, and so no fit focus for a real academic.

It seems we cannot study new objects—new media, auto/biography writ broad—without fundamentally altering scholarly practices in English (and perhaps elsewhere), challenging values. Even if someone is rooted firmly in the prior art (getting a PhD in English, examining standard literary materials [what a funny, telling locution!]) a serious move into new objects of study seems to inevitably entail a radical shift in perspective, in practice. We are moving from New Criticism and its sole focus on the text, to all the flavours of poststructuralism and theory that focused on structural questions and power, to ... what has been termed participant ethnography and from there a squishier kind of content analysis based on our own expertise in the "communities" whose texts we are examining?

That’s research--but what about my class? As for how the classroom changes, it seems to me that like in the theory wars (and the focus on race/class/gender and the move toward cultural studies) teachers and students are a lot more exposed as thinking, feeling, variously-positioned subjects. This makes us all vulnerable, and may call for a new kind of teaching practice, a sensitivity and a willingness to make a safe space, somehow.

Wednesday, November 14, 2012

DH Grad Students! What Do You Want?

Seriously now, tell me. I'm currently a Member at Large on the executive of the Canadian Society for Digital Humanities, and I've decided to do some work to help grad students.

I'm not, actually, a grad student, and I haven't been since, err, 2004. But I was a grad student in Digital Humanities (or "Humanities Computing" as we used to call it when I was a youngster) from 1997-2004, over two degrees and in two universities. From the wonderful, invigorating panel discussion on graduate issues in DH at last year's Congress in Waterloo, I learned that ... much has changed since my time, but also, depressingly, other things have not changed at all.

Some issues I discern:

  • Alt-ac: is it really a thing? A desirable thing? How to plan for this career path? How to advocate for this work as "alternatively academic" rather than "alternative to academic"?
  • Dissertations: Should dissertation or completion requirements devised, in my discipline, for literature students investigating their ideas in prose be the same ones we use to assess or train digital humanists? What is the role of the built object in capstone work for DH PhDs? How about programming? Or soldering? 
  • Double-disciplining: It is still true that digital humanists, particularly students, have to do twice the work for half the credit--not just in the dissertation, but in everything. We are expected to have full knowledge of the primary discipline that houses and credentials us, as well as full knowledge of the emerging canon of thinking and range of practice of DH.
  • Can you fix my printer / build the department website / manage our Facebook page / explain the cloud to the department? Even DHers hired into academic positions may find themselves being asked to do kinds and numbers of tasks other professors or students are not asked to do.
  • RA work: how is being a research assistant in DH different from standard RA positions? Are there some best practices here? How to explain to others the value or scope of this work, and the skills unique to these positions? (Like project management, collaboration, technical work, etc.)
  • Unreasonable job ads: "Department of Something seeks expert in Medieval Whatsit, an expertise in composition theory, and a funded multimillion dollar research project in DH." Many hiring departments do not seem to know quite how to hire a DHer into the tenure track, and what it is reasonable to expect. Needs some advocacy?
  • Access to research equipment, datasets, etc. This work often requires high-power tech, beyond the index cards (or Scrivener) that other humanist grad students need to get their work done. How to fund this?
  • Training: Nuff said.
  • Grad programs in DH: what should these look like? Are they a good idea? In what ways and for whom?
What do YOU think needs addressing? And how can I (and by extension, CSDH) help?

Drop some comments; tweet this and share with your network; beat the bushes for some ideas and some interested parties.

Give it your best shot, and I'll give it mine.

Tuesday, November 8, 2011

Whither teh typo? Damn you, Lion autocorrect!

I just upgraded the operating system on my MacBook Pro from Snow Leopard to Lion. Immediately, I notice some big changes. My first reaction is that my computer has got a lot ... faster. Next, I am completely discombobulated by the reversal of the scrolling interface: instead of the direction of my swiping fingers matching the direction of the scroll bar, the new "natural" scrolling works like on my iPad and my iPhone, which is to say it mimics the direction of the content.

Yeah. I turned that off for a couple of days until I felt better able to cope.

The most serious change, unexpectedly, is Lion's vaunted cross-everything autocorrect. I'm not sure I like it. Because it takes my typos and turns them into something new and different.

I like when "teh" gets autocorrected to "the." Not so much, though, when "shit" becomes "shot" or "loooooooveyou" turns into "lollygagging" or "effect" into "effete." As we all know, autocorrect makes some hilarious mistakes, viz http://damnyouautocorrect.com. This is super annoying on the phone, where undoing the auto-'corrections' interrupts the flow of thinking and is in any case sometimes nearly impossible. And between my clumsy thumbs typing things incorrectly, and autocorrect further mangling my intentions? Comes my all-purpose apology / sigfile: "Sent from my iPhone."

"Sent from my iPhone" is my catchall excuse for mangled typing--it's not, I'm trying to tell my readers through this sigfile, that my thinking or my spelling are wrong, it's that my damn thumbs and this damn software just can't capture it very well in a timely fashion.

I always forgive people the weirdness of their "Sent from my iPhone" emails, the same way I cut slack to unintentionally illiterate text messages: people are communicating under suboptimal conditions and I adjust my expectations around grammar, capitalization, punctuation, and flair accordingly.

The problem for me, now, is that autocorrect has jumped from just mangling my text messages and brief emails, to mangling my everything: blog posts, real emails, comments on blogs, and all the rest of it. And with no "Sent from my iPhone" sigfile excusing the strange diction that can sometimes result, I feel weirdly exposed: my understandable occasional typos online turn into inexcusable lapses in vocabulary, sense, and spelling with the intrusion of this new "feature."

I find I have to bring a new and different kind of vigilance to my writing online from my main computer now: I used to scan briefly for typos before posting a comment on a blog for example, but now I'm a lot more careful to make sure an autocorrect howler hasn't distorted my meaning in a way that makes me look potentially wackadoo to non-Lion users.

So the upshot? Lion has called an end to the innocent typographical error, and replaced it, I think, with the more insidious semantic errors introduced by an autocorrect that's almost always less smart than the writer it's scolding.

We'll all get used to this in a couple of months, I imagine: we always do. But it's interesting for me to think about the social norms around correctness and errors in online communication, social norms that were maybe not 100% explicitly clear to me until Lion went and disrupted them all.

Sunday, April 10, 2011

What makes a great conference?

I'm not yet home from Theorizing the Web 2011, just sitting in the Starbucks at the Marriott wondering what it is that made this conference so awesome. Because it was awesome: I ran out of paper in my notebook from writing so much down.

I'm thinking it's the grad students.

I go to a lot of conferences and, if I may be frank, was rapidly becoming disenchanted, nay, jaded, about the whole system.

What makes me jaded about conferences:

- The panels are unfocused, so usually there's only one paper of the three--or, god help me, four--that I actually want to see
- There are way, way too many concurrent panels, so that the conference has 400 people at it, but each session has 4 people presenting to 7 other people
- The panels start and finish late, throwing off my eating / peeing / time management
- The papers run overtime
- The speakers read excerpts from an article, so it's too dense and too long, and hard to decipher
- The speakers have prepared their presentations on the plane, and it shows
- The panel chair won't actually stop papers that run too long, so at the end, there's no time at all for questions
- There's never enough break time to actually talk to people and network
- There's no food, so when there are breaks, everyone scatters to the wind: no time to actually talk to people and network
- Sometimes, people won't talk to you if they don't already know you
- Often, people skip out on vast chunks of the conference to do the meeting and networking they otherwise have no time for

What was awesome about TTW2011

- The panels were very focused
- People actually attended panels, and the keynotes, for the whole day. My paper was in the last session of the day, right before the keynote, at 5pm, and there were 40 people attending.
- All the panel rooms were in one hallway, leading to group cohesion and chance conversations
- It was one day long.
- People were very friendly: I suspect this might be because of Twitter, which breaks the ice. If we follow each other online, I'm more likely to walk up and introduce myself, because you've already indicated some sort of interest in my acquaintance, or my work.
- There was food in the morning, and food at night. And a party with a live band and beer in Rubbermaid-bucket coolers.
- Most importantly, perhaps: people put a lot of effort into their presentations, with the result that they were professional, and clear, and on point. Some even managed *funny*.

I am impressed at the level of smarts, research, and preparation of all the talks I went to. Did some people learn the hard way that 42 slides take more than 15 minutes to discuss? Probably. But the slides were really, really good, so I assume that's a rookie mistake of over-ambition, which I'm kinda inclined to forgive readily.

(It's very easy for me to tell other people they went overtime: we only had two speakers on my panel, so my co-panelist and I had the luxury of 25-30 minute presentations, with still lots of time for questions--and what great questions!)

So. The upshot is this: I'm going to go to more conferences where graduate students predominate on the program. In new media studies, where something new in the tech or theory or cultural realm pops up on the radar almost faster than the speed of scholarship, it already makes sense to see what the kids are up to. Beyond that, though, I think grad students just put more effort into their papers. As a result, they present better work that's more valuable to me in my own research, and that's the point, right?

Professors, do you think we can step up our game?