Jeremy Keith's Blog, page 115
May 28, 2014
The more the merrier at Responsive Day Out 2
It’s just a little over four weeks until Responsive Day Out 2: Revenge Of The Width.
The final piece of the line-up has just dropped into place. Dan Donald will be joining us to talk about pushing the browser to achieve what the standards bodies are dragging their heels on. I’m very much looking forward to hearing what he has to share, just as I’m looking forward to hearing all the talks.
I’ve been liaising with each of the speakers to figure out the best way to craft a structured flow for the day. I’m positively giddy with anticipation now. Every one of the talks sounds like a valuable nugget of winningness.
I have some more Responsive Day Out news…
If you already have your ticket, great! I’ll see you on June 27th.
If you don’t already have your ticket, despair not! I’ve been working with the people at the venue to figure out a way of getting some more seats into the venue and I’m happy to report that we’ve been able to expand the capacity.
So grab your ticket now for the bargain price of just £80 plus VAT. There aren’t many extra tickets and they probably won’t last long, so get in there.
May 27, 2014
Ten years of dConstruct
Tickets for dConstruct 2014 have been on sale for just over a week now. If you haven’t nabbed yours yet, here’s the URL:
ti.to/clearleft/dconstruct-2014
This will be the tenth dConstruct. Ten years! It’s pretty crazy to look back through the archive and see how the event has evolved from its humble beginnings in 2005 to last year’s magnificent tour-de-force.
If you missed out last year, the videos are all online. You can watch the talks by Sarah Angliss, Maciej Cegłowski, Dan Williams and all the other terrific presenters.
And after you’ve done that, book your place in the Brighton Dome for Friday, September 5th, 2014. Believe me, you don’t want to miss out on this year’s event.
Ten years! Crazy.
It’s kind of fun to look back at the themes for each year:
2005: Web 2.0 — back then, it sorta meant something.
2006: Something Something Web Apps — we were still figuring it out.
2007: Designing the User Experience — this led directly to the creation of UX London.
2008: Designing the Social Web — a classic!
2009: Designing for Tomorrow — starting to get pretty cerebral.
2010: Design and Creativity — this was a perfect blend; top-notch speakers.
2011: Designing Digital Products — a tricky one; some talks divided opinion.
2012: Playing With The Future — the best conference I’ve ever been to. No exaggeration.
2013: Communicating With Machines — we laughed, we cried, it was bloody brilliant.
2014: Living With The Network — see you there.
Tagged with
dconstruct
conference
brighton
history
dconstruct2014
May 26, 2014
Selfish publishing
I was in Düsseldorf last week, alas just a little too late to catch any of the talks at Beyond Tellerrand …which is a shame, because it looks like Maciej’s talk was terrific. Fortunately, I did get to see a lot of people who were in town for the event, and both Maciej and I were participating in Decentralize Camp the day after Beyond Tellerrand.
Decentralize Camp had a surprisingly broad scope. As Maciej pointed out during his presentation, there are many different kinds of decentralization.
For my part, I was focusing specifically on the ideas of the indie web. I made it very clear from the outset that this was my own personal take. A lot of it was, unsurprisingly, rooted in my relentless obsession with digital preservation and personal publishing. I recapped some of what I talked about at last year’s Beyond Tellerrand before showing some specific examples of the indie web at work: IndieAuth, webmentions, etc.
I realised that my motivations were not only personal, but downright selfish. For me, it’s all about publishing to my own site. That attitude was quite different to many of the other technologies being discussed; technologies that explicitly set out to empower other people and make the world a better place.
Now, don’t get me wrong, I must admit that one of the reasons why I write and talk about the indie web is that in the back of my mind, I’m hoping others will be encouraged to publish on their own websites instead of (or as well as) giving their creative work to third-party sites. As I’ve said before:
…on today’s web of monolithic roach-motel silos like Facebook and Twitter, I can’t imagine a more disruptive act than choosing to publish on your own website.
That said, I’m under no illusions that my actions will have any far-reaching consequences. This isn’t going to change the world. This isn’t going to empower other people (except maybe people who are already tech-savvy enough to empower themselves). I’m okay with that.
At Decentralize Camp, I helped Michiel B. de Jong to run a session on IndieMark, a kind of tongue-in-cheek gamification of indie web progress. It was fun. And that’s an important factor to remember in all this. In fact, it’s one of the indie web design principles:
Have fun. Remember that GeoCities page you built back in the mid-90s? The one with the Java applets, garish green background and seventeen animated GIFs? It may have been ugly, badly coded and sucky, but it was fun, damnit. Keep the web weird and interesting.
During the session, someone asked why they hadn’t heard of all this indie web stuff before. After all, if the first indie web camp happened in 2011, shouldn’t it be bigger by now?
That’s when I realised that I honestly didn’t care. I didn’t care how big (or small) this group is. For me, it’s just a bunch of like-minded people helping each other out. Even if nobody else ever turns up, it still has value.
I have to admit, I really don’t care that much about the specific technologies being discussed at indie web camps: formats, protocols, bits of code …they are less important than the ideas. And the ideas are less important than the actions. As long as I’m publishing to my website, I’m pretty happy. That said, I’m very grateful that the other indie web folks are there to help me out.
Mostly though, my motivations echo Mandy’s:
No one owns this domain but me, and no one but me can take it down. I will not wake up one morning to discover that my service has been “sunsetted” and I have some days or weeks to export my data (if I have that at all). These URLs will never break.
Tagged with
indieweb
publishing
decentralizecamp
indie
personal
motivation
indiewebcamp
May 16, 2014
dConstruct tickets
Tickets for dConstruct 2014 go on sale on Monday morning at 11am.
I expect there’ll be quite a rush for tickets initially, but don’t worry—if you aren’t able to get to an internet-enabled device to secure your place the moment that tickets go on sale, rest assured that there’ll be tickets available for quite a while. For the past few years, there have been tickets still available right up until a month before the event itself.
That said, you might as well grab your ticket straight away. You definitely don’t want to miss this year’s event. Just look at that amazing line-up.
Oh, and that line-up just got a little bit more amazing. I’m pleased as punch to announce that Jen Lowe will be joining us for dConstruct. Just one more brilliant and talented person to add to the roster of brilliant and talented people who are going to make this year’s dConstruct something else.
If you’re travelling from outside Brighton, then the first thing you might want to do after securing your dConstruct ticket is to find some accommodation. Here’s a dConstruct page on AirBnB listing plenty of available lodgings.
I recommend sticking around for the weekend after dConstruct too. As well as the annual Maker Faire and the Brighton and Hove Food Festival, there’s going to be plenty of other events happening under the banner of the Brighton Digital Festival.
Brighton is definitely the place to be in the first week of September.
And the dConstruct ticket page is definitely the place to be on Monday morning.
(One thing to note: if you’re buying a whole bunch of tickets for your workmates, please make sure to add a name for each ticket. Don’t worry; you’ll be able to update the names on the tickets at any time up ‘till a couple of weeks before the event itself. So even if you’re not sure now who the final attendees will turn out to be, you can adjust the tickets once you figure it out. But you can’t leave the names blank—if you do, I’m afraid the whole order will be cancelled.)
Tagged with
dconstruct
conference
brighton
dconstruct2014
May 12, 2014
Seams
“The function of science fiction,” said Ray Bradbury, “is not only to predict the future, but to prevent it.”
Dystopias are the default setting for science fiction. It’s rare to find utopian sci-fi, and when you do—as in the post-singularity Culture novels of Iain M. Banks—there’s always more than a germ of dystopia; the ustopias that Margaret Atwood speaks of.
You’ve got your political dystopias—1984 and all its imitators. Then there’s alien invasion dystopias, machine-intelligence dystopias, and a whole slew of post-apocalyptic dystopias: nuclear war, pandemic disease, environmental collapse, genetic engineering …take your pick. From the cosy catastrophes of John Wyndham to Cormac McCarthy’s The Road, this is the stock and trade of speculative fiction.
Of all these undesirable futures, the one that troubles me more than any other is the Wall·E dystopia. I’m not talking about the environmental wasteland depicted on Earth. I mean the ustopia depicted aboard the generation starship The Axiom. Here, humanity’s every need is catered to without requiring any thought. And so humanity atrophies, becoming physically obese and intellectually lazy.
It’s not a new idea. H. G. Wells had already shown us a distant future like this in his classic novel The Time Machine. In the far future of that book’s timeline, humanity splits into two. The savagery of the cannibalistic Morlocks is contrasted with the docile, passive stupidity of the Eloi, but as Jaron Lanier points out, both endpoints are equally horrific.
In Wall·E, the Eloi have advanced technology. Their technology has been designed according to a design principle enshrined in the title of a Dead Kennedys album: Give Me Convenience Or Give Me Death.
That’s the reason why the Wall·E dystopia disturbs me so much. It’s all-too believable. For many years now, the rallying cry of digital designers has been epitomised by the title of Steve Krug’s terrific book, Don’t Make Me Think. But what happens when that rallying cry is taken too far? What happens when it stops being “don’t make me think while I’m trying to complete a task” and becomes simply “don’t make me think”, full stop?
Convenience. Ease of use. Seamlessness.
On the face of it, these all seem like desirable traits in digital and physical products alike. But they come at a price. When we design, we try to do the work so that the user doesn’t have to. We do the thinking so the user doesn’t have to. Don’t make the user think. But taken too far, that mindset becomes dangerous.
Marshall McLuhan said that every extension is also an amputation. As we augment the abilities of people to accomplish their tasks, we should be careful not to needlessly curtail what they can do:
Here we are, a society hell bent on extending our reach through phones, through computers, through “seamless integration” and yet all along the way we’re unwittingly losing perhaps as much as we gain. The mediums we create are built to carry out specific tasks efficiently, but by doing so they have a tendency to restrict our options for accomplishing that task by other means. We begin to learn the “One” way to do it, when in fact there are infinite ways. The medium begins to restrict our thinking, our imagination, our potential.
The idea of “seamlessness” as a desirable trait in what we design is one that bothers me. Technology has seams. By hiding those seams, we may think we are helping the end user, but we are also making a conscious choice to deceive them (or at least restrict what they can do).
I see this a lot in the world of web development. We’re constantly faced with challenges like dealing with users on slow networks or small screens. So we try to come up with solutions (bandwidth media queries, responsive images) that have at their heart an assumption that we know better than the end user what they should get.
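To make that concrete, here’s a purely hypothetical sketch using the experimental (and patchily supported) Network Information API. The image file names and the selector are invented for illustration; the point is the silent decision being made on the user’s behalf:

    // Hypothetical sketch: quietly swap in a smaller image for anyone on a
    // cellular connection, without ever asking what they actually want.
    var connection = navigator.connection || navigator.mozConnection;
    var heroImage = document.querySelector('img.hero'); // assumes such an image exists
    var source = 'hero-large.jpg';
    if (connection && connection.type === 'cellular') {
        // Mobile data is assumed to mean “low quality, please”.
        source = 'hero-small.jpg';
    }
    if (heroImage) {
        heroImage.src = source;
    }

There’s nothing technically wrong with code like that; the problem is the assumption baked into it, namely that a cellular connection automatically means the user wants the degraded version.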
I’m not saying that everything should be an option in a menu for the user to figure out—picking smart defaults is very much part of our job. But I do think there’s real value in giving the user the final choice.
I remember Jake giving a good example of this. If he’s travelling and he’s on a 3G network on his phone, or using shitty hotel WiFi on his laptop, and someone sends him a link to a video of some cats, he doesn’t mind if he gets the low-quality version as long as he gets to see the feline shenanigans in short order. But if he’s in the same situation and someone sends him a link to the just-released trailer for the new Star Trek movie, he’s willing to wait for hours so that he can watch in high-definition.
That’s a choice. All too often, these kinds of choices are pre-made by designers and developers instead of being offered to the end user. We probably mean well, but there’s a real danger in assuming that, just because someone is using a particular device, we can infer their context:
Mind reading is no way to base fundamental content decisions.
My point is that while we don’t want to overwhelm the user with choice overload, we also need to be careful not to unintentionally remove valuable choices that can empower people. In our quest to make experiences seamless, we run the risk of also making those experiences rigid and inflexible.
The drive for a “seamless experience” has been used to justify some harsh amputations. When Twitter declared war on the very developers it used to champion, and changed its API and terms of service so that tweets had to be displayed the same way everywhere, it was done in the name of “a consistent user experience.” Twitter knows best.
The web is made up of parts and there are seams between those parts: HTML, HTTP, and URLs. The software that can expose or hide those seams is the web browser. Web browsers are made by human beings and it’s the mindset and assumptions of those human beings that determines whether web browsers are enabling or disabling users to make use of those seams.
“View source” is a seam that exposes the HTML lying beneath every web page. That kind of X-ray vision can be quite powerful. Clearly it’s not an important feature for most users, but it is directly responsible for showing people how web pages are made …and intimating that anyone can do it. In the introduction to my first book I thanked “view source” along with my other teachers like Jeff Veen, Steve Champeon, and Jeffrey Zeldman.
These days, browsers don’t like to expose “view source” as easily as they once did. It’s hidden amongst the developer tools. There’s an assumption there that it’s not intended for regular users. The browser makers know best.
There are seams between the technologies that make up a web page: HTML, CSS, and JavaScript. The ability to enable or disable those layers can be empowering. It has become harder and harder to disable JavaScript in the browser. Another little amputation. The browser makers know best.
The CSS that styles web pages can be over-ridden by the end user. This is not a bug. It is a very powerful feature. That feature is being removed:
I understand that vendors can do whatever they want to control how you experience the web, because it is their software, their product, but removing user stylesheets feels sooo un-web to me, which is irony. A browser’s largest responsibility is to give people access to the web. It’s like the web is this open hand, but software is this closed fist.
Then there’s the URL. The ultimate seam.
Historically, browsers have exposed this seam, but now—just as with “view source” and user stylesheets—the visibility of the URL is being relegated to being a power-user tool.
The ultimate amputation.
The irony here is that the justification for this change is not the usual mantra of providing “a more seamless user experience.” Instead, the justification is supposedly security.
This strikes me as really strange. Security is the one area where seamlessness is definitely not a desirable characteristic. A secure system requires people to be mindful and aware of their situation. This is certainly true on the web, as Tom points out:
Hiding information away makes me less able to make decisions: it makes me a less informed user.
The whole reason that phishing is a problem is because users don’t pay any bloody attention to what they see in their location bar. Putting less information in the location bar makes the location bar less useful and thus there’s less point paying any attention to it.
Tom has hit on the fundamental mismatch here. Chrome is a piece of software that wants to provide a good user experience—“don’t make me think!”—while at the same time trying to make users mindful of their surroundings:
Security requires educated, pro-active, informed thinking users.
Usability is about making the whole process of using the web seamless and thoughtless: a child should be able to do it.
So from the security standpoint, obfuscating the URL is exactly the wrong thing to do.
In order to actually stay safe online, you need to see the “seams” of the web, you need to pay attention, use your brain.
Chrome knows best.
Making it harder to “view source” might seem like an inconsequential decision. Removing the ability to apply user stylesheets might seem like an inconsequential decision. Heck, even hiding the URL might seem like an inconsequential decision. But each one of those decisions has repercussions. And each one of those decisions reflects an underlying viewpoint.
Make no mistake, all software is political. We talk about opinionated software but really, all software is opinionated, whether we like it or not. Seemingly inconsequential interface decisions are actually reflections of assumptions, biases and beliefs.
As Nat points out, like all political decisions, this is about power:
There’s been much debate about whether the URLs are ‘ugly’ or ‘beautiful’ and whether people really understand them. This debate misses the point.
The URLs are the cornerstone of the interconnected, decentralised web. Removing the URLs from the browser is an attempt to expand and consolidate centralised power.
If that’s the case, then it really doesn’t matter what we think about Chrome removing visible URLs. What appears to be a design decision about the user interface is in fact a manifestation of a much deeper vision. It’s a vision of a future where people can have everything their heart desires without having to expend needless thought. It’s a bright future filled with seamless experiences.
Welcome aboard The Axiom.
Buy n Large knows best.
Tagged with
sci-fi
sciencefiction
future
dystopia
technology
browsers
web
seamful
design
urls
chrome
power
May 4, 2014
Talking and travelling
I’m in America. This is a three-week trip and in those three weeks, I’m speaking at four conferences.
That might sound like a fairly hectic schedule but it’s really not that bad at all. In each place I’m travelling to, travel takes up a day, the conference portion takes up a couple of days, but I still get a day or two to just hang out and be a tourist, which is jolly nice.
This sojourn began in Boston where I was speaking at An Event Apart. It was—as ever—an excellent event and even though I was just speaking at An Event Apart in Seattle just a few weeks ago, there were still plenty of fresh talks for me to enjoy in Boston: Paul talking about performance, Lea talking about colour in CSS, Dan talking about process, and a barnstorming talk from Bruce on everything that makes the web great (although I respectfully disagree with his stance on DRM/EME).
My own talk was called The Long Web and An Event Apart Boston was its final outing. I first gave it at An Event Apart DC back in August—it’s had a good nine-month run.
My next appearance at An Event Apart will be at the end of this American trip in San Diego. I’ll be presenting a new talk there. Whereas my previous talk was a rambling affair about progressive enhancement, responsive design, and long-term thinking, my new talk will be a rambling affair about progressive enhancement, responsive design, and long-term thinking.
Sooner or later people are going to realise that I keep hammering home the same message in all my talks and this whole speaking-at-conferences gig will dry up. Until then, I’ll keep hammering home that same old message.
I have two opportunities to road-test this new talk before An Event Apart San Diego (for which, by the way, tickets still remain: use the code AEAKEITH when you’re booking to get $100 off).
I’ll be speaking at Bmoresponsive in Baltimore at the end of this week. Before that, I have the great pleasure (and pressure) of opening the show tomorrow at the Artifact conference here in good ol’ Austin, Texas (and believe it or not, you can still get a ticket: this time use the code ADACTIO100 when you’re booking to get $100 off).
Until then, I have some time to wander around and be a tourist. It is so nice to be here in Austin when it’s not South by Southwest. I should probably be fretting over this talk but instead I’m spending my time sampling tacos and beers in the sunshine.
Tagged with
conference
speaking
travel
america
aneventapart
aeaboston
aeasandiego
artifactconf
bmoresponsive
presentation
boston
austin
baltimore
sandiego
URLy warning
I’m genuinely shocked that Jake thinks that Chrome hiding URLs is a good thing. On the one hand, he says:
The URL is the share button of the web, and it does that better than any other platform. Linkability and shareability is key to the web, we must never lose that…
I absolutely agree with him there. But I very much disagree when he says:
…and these changes do not lose that.
The method he describes for getting at a URL to share is this:
clicking the origin chip or hitting ⌘-L.
Your average user is no more likely to figure out how to do that than they are to figure out how to view source (something that Chrome buried as a “developer” feature some time ago).
Cennydd recently said of URLs:
And I don’t agree that good URLs are beautiful. Even those are an alphabet soup of slashes, dots, two-letter countries, and no spaces.
— Cennydd Bowles (@Cennydd) May 1, 2014
I mostly agree with him. The protocol portion of the URL is pretty pointless, and the domain name and TLD are never what I would describe as “beautiful”. No, when I talk about beautiful URLs, I mean the path that comes after the protocol, domain name, and TLD gumpf …the very bit that Chrome is looking to hide.
URLs are universal. They work in Firefox, Chrome, Safari, Internet Explorer, cURL, wget, your iPhone, Android and even written down on sticky notes. They are the one universal syntax of the web. Don’t take that for granted.
URLs are for humans. Design them for humans.
Of course your average user probably won’t even know what a URL is, and nor should they. But they know what a link is. They know that, until now, they could copy the “link” from the top of their browser and paste it into an email, or a text message, or a word processing document.
If this Chrome experiment goes forward, we can kiss all that goodbye.
The security issue that Jake outlines is that browsers need to make the domain name portion of the URL clearly visible. I hope that the smart folks working on Chrome can figure out a way to do that without castrating the browser’s ability to easily share links.
It’s a classic case of:
Something must be done!
This (killing URLs) is something.
Something has been done.
Technically, obfuscating the URL seems to solve the security issue. But technically, decapitation seems to solve a headache.
Tagged with
chrome
browsers
urls
sharing
links
web
security
April 25, 2014
Analytical
When I was talking about Huffduffer, I mentioned that I don’t have any analytics set up for the site:
To be honest, I’m okay with that—one of the perks of having a personal project is that the only metric that really matters is your own satisfaction.
For a while, I did have Google Analytics set up on The Session. But I started to feel a bit uncomfortable about willingly opening up a wormhole between my site and the Google mothership. It bothered me that Adblock Plus would show that one ad had been blocked on the site. There are no ads on the site, but the presence of the Google Analytics code was providing valuable information to Google—and its advertiser customer base—so I can understand why it gets flagged up like any other unwanted tracking.
Theoretically, users have a way of opting out of this kind of tracking by switching on the Do Not Track header (if it isn’t switched on by default). Looking at the default JavaScript code that Google provides for setting up Google Analytics, I don’t see any mention of navigator.doNotTrack.
Now, it may well be that Google sniffs for that header (and abandons any tracking) when its server is pinged via the analytics code, but there’s no way to tell from this side of the Googleplex. I certainly don’t see any mention of it in the JavaScript that gets inserted into our web pages.
I was wondering whether it makes sense to explicitly check for the doNotTrack header before opening up that connection to google-analytics.com via a generated script element.
So if the current code looks like:
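Something like the standard asynchronous snippet that Google hands out for analytics.js (the property ID here is just a placeholder):

    // Google's stock asynchronous tracking snippet (placeholder property ID).
    (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
    (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
    m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
    })(window,document,'script','//www.google-analytics.com/analytics.js','ga');

    ga('create', 'UA-XXXXX-Y', 'auto');
    ga('send', 'pageview');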
Would it make sense to wrap it with some kind of test for navigator.doNotTrack:
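Perhaps something along these lines (the checks for the older “yes” value and the vendor-prefixed variants are just belt and braces):

    // Only inject analytics.js if the browser hasn't expressed a
    // Do Not Track preference.
    var dnt = navigator.doNotTrack || window.doNotTrack || navigator.msDoNotTrack;
    if (dnt !== '1' && dnt !== 'yes') {
        (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
        (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
        m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
        })(window,document,'script','//www.google-analytics.com/analytics.js','ga');
        ga('create', 'UA-XXXXX-Y', 'auto');
        ga('send', 'pageview');
    }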
For the love of mercy, don’t actually use that code—it’s completely untested and probably causes more harm than good. But you can see the idea that I’m trying to get at, right? Google Analytics most definitely counts as tracking so it seems like the ideal use-case for Do Not Track.
It raises a few questions:
Is anyone doing this already? It might well be that the answer is “no”, not because of any reluctance to respect user preferences but because the doNotTrack spec is very much in flux.
Would you consider doing this?
If you were to do this, could you foresee getting pushback from within your own company?
Tagged with
google
analytics
donottrack
privacy
javascript
April 24, 2014
Huffduff up and up
I had a nice Skype chat with Stan Alcorn yesterday all about Huffduffer, online sharing of audio, and all things podcasty and radioish. I’m sure I must have talked his ear off.
Stan was asking about numbers for Huffduffer’s user base and activity. I have to admit that I’ve got zero analytics running on the site. To be honest, I’m okay with that—one of the perks of having a personal project is that the only metric that really matters is your own satisfaction. But I told Stan I’d run some quick database queries to get some feeling for Huffduffer’s usage patterns. Here’s what I found…
There are 5,862 people signed up to Huffduffer.
About 150,919 items have been huffduffed. But those aren’t unique files. The total number of distinct files that have been huffduffed is 5,972. That means that, on average, an audio file is huffduffed around 26 times. And the average user has huffduffed around 30 items. But neither of those averages comes from an even distribution; they’d follow power laws rather than bell curves. For example, the most popular file was huffduffed 329 times.
Looking at the amount of huffduffing done each year, there’s a pleasing upward trend.
1st year: 7,382
2nd year: 19,080
3rd year: 23,403
4th year: 31,808
5th year: 41,514
I was pleasantly surprised by this. I would’ve assumed that Huffduffer usage would be more of a steady-state affair, but it looks like the site is getting used a bit more with each passing year (the site is currently in its sixth(!) year).
Not that any of that really matters. I built Huffduffer to scratch my own itch. I huffduff an average of 411 audio files each year. So even if nobody else used Huffduffer, it would still provide plenty of value to me.
Like I was saying to Stan, the biggest strength and the biggest weakness of audio—as opposed to text or video—is that you can listen to it while you’re doing other things. For some people, car journeys are the perfect podcast time. For others, it might be doing the dishes or train journeys. For me, it’s the walk to and from work each day—it takes about 35 minutes each way, and I catch up on my Huffduffer feed during that time.
Jessica and I will often listen to some spoken word audio in the background during dinner—usually something quite radio-y like Radiolab, or NPR stories. Yesterday, we were catching up with Aleks’s BBC documentary series, The Digital Human. It was the episode about voice.
The Digital Human: Voice on Huffduffer
Imagine my surprise when I heard the voice of Stan Alcorn. What a co-inky-dink!
Tagged with
huffduffer
audio
stats
analytics
April 23, 2014
Announcing dConstruct 2014
I’ve been putting together the website for this year’s dConstruct and I reckon it’s in a decent enough shape to ship, so without further ado, I present to you…
dConstruct 2014 — Living With The Network
Here’s what you need to know:
dConstruct 2014 takes place on September 5th in the Brighton Dome.
Tickets will cost £150+VAT.
Tickets go on sale at 11am on May 19th.
It will be bloody brilliant.
To clarify that last point, it will be bloody brilliant because of the people who will be speaking. Like, ooh, I don’t know …Warren Fucking Ellis!
Mandy Brown!, Aaron Straup Cope!, Clare Reddington!, Tom Scott!, Leila Johnston!, Brian Suda!…
I’m ludicrously excited about the line-up for this year’s event, and what’s on the website isn’t even the full roster; there’s more to come. But I can’t contain my excitement any longer and I just have to share this with everyone.
Now, you may not recognise every name on the line-up. Heck, you may not recognise any of the names on the line-up. But if you were at dConstruct last year (or the year before) then I hope I’ve earned your trust. And trust me, this is going to be a fantastic day.
So put Monday, May 19th in your calendar so you can grab your ticket when they go on sale (don’t worry—there’s plenty to go around). And put Friday, September 5th in your calendar and I’ll see you in the Brighton Dome for the event of the year.*
*Not hyperbole
Tagged with
dconstruct
dconstruct2014
conference
brighton