Jeremy Keith's Blog, page 150
January 3, 2011
DOM Scripting, second edition
You may have noticed that there's a second edition of DOM Scripting out. I can't take any credit for it; I had very little to do with it. But I'm happy to report that the additions meet with my approval.
I've written about it on the DOM Scripting blog if you want all the details on what's new. In short, all the updates are good ones for a book that's now five years old.
If you've already got the first edition, you probably don't need this update, but if you're looking for a good introduction to JavaScript with a good smattering of Ajax and HTML5, and an additional dollop of jQuery, the second edition of DOM Scripting will serve you well.
The cover of the second edition, alas, is considerably shittier than the first edition. So don't judge the second edition of a book by its cover.
Tagged with
book
domscripting
writing
publishing
javascript
January 1, 2011
Reading the street
Like many others, I was the grateful recipient of a Kindle this Christmas. I'm enjoying having such a lightweight reading device and I'm really enjoying the near-ubiquitous free connectivity that comes with the 3G version.
I can't quite bring myself to go on a spending spree for overpriced DRM'd books with shoddy layout and character encoding, so I've been getting into the swing of things with the freely-available works of Cory Doctorow. I thoroughly enjoyed For The Win—actually, I read that one on my iPod Touch—and I just finished Makers on the Kindle.
The plot rambles somewhat but it's still an entertaining near-future scenario of hardware hackers creating and destroying entire business models through the ever-decreasing cost and ever-increasing power of street-level technology.
Cracking open the case of a particularly convincing handset, he offers advice on identifying a fake: a hologram stuck on the phone's battery is usually a good indication that the product is genuine. Two minutes later, Chipchase approaches another stall. The shopkeeper, a middle-aged woman, leans forward and offers an enormous roll of hologram stickers.
Chipchase, mouth agape, takes out the Canon 5D camera that he uses to catalogue almost everything he sees. "What are these for?" he asks, firing off a dozen photographs in quick succession. "You stick them on batteries to make them look real," she says, with a shrug. Chipchase smiles, revelling in the discovery. "I love this!" he yelps in delight, and thanks the shopkeeper before heading off to examine the next stall.
That isn't a passage from Makers. That's from a Wired magazine article by Bobbie: a profile of Jan Chipchase and his predilection for Shanzhai, the counterfeit electronic goods on the streets of Shanghai …not unlike the Bambook Kindle clone.
December 31, 2010
Twenty Ten
…another year over, and what have you done?
Well, quite a bit actually, Mr. Lennon.
In 2010 Jessica and I moved into our new home in the Elm Grove area of Brighton. It's a really nice place in a quiet neighbourhood and it lies at the top of a fairly steep hill, which may be of benefit to my physical condition. This is also the first place that we're not renting. We've got a mortgage now, which technically puts us in the category of being homeowners …although it's actually the bank that owns it.
In 2010 I became a cyborg. I've been wearing glasses since September. They're especially handy for conference halls and cinemas.
In 2010 Salter Cane released their second album, Sorrow. Modesty forbids me naming it album of the year. That accolade undoubtedly goes to High Violet by The National (and film of the year undoubtedly goes to Inception).
In 2010 my third book was published. I'm very proud of it.
In 2010 I spoke at An Event Apart five times in five different cities in the US. I loved every minute of it.
In 2010 I compèred dConstruct. It was the best yet.
In 2010 I attended Simon and Nat's wedding and officiated at Cindy and Matt's wedding. They were joyous occasions.
In 2010 I organised the world's first Science Hack Day in London and attended the world's second Science Hack Day in San Francisco. Both events were indescribably excellent.
Bring on 2011.
Tagged with
2010
December 27, 2010
Tweaking Huffduffer
Because I was so busy, the two-year anniversary of Huffduffer passed unnoticed back in October. Two years! It's hard to believe. It seems like just yesterday that I launched it. It's been ticking along nicely for all that time and I've been tweaking it whenever I get the chance.
I recently added oEmbed support. I'm very impressed with the humble little format. It's basically a unified API onto the multiple embed codes provided by so many websites. You pass a URL to an endpoint such as http://huffduffer.com/oembed?url= and you get back a JSON (or XML) file with the details of the HTML you need to embed the content—video, photo, whatever. Something like http://huffduffer.com/oembed?url=http://huffduffer.com/adactio/32454
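Here's a sketch of the kind of exchange involved. The request URL is the one above; the response fields are the standard oEmbed ones, but the title and embed markup shown here are invented for illustration:

    curl "http://huffduffer.com/oembed?url=http://huffduffer.com/adactio/32454"

    {
        "version": "1.0",
        "type": "rich",
        "provider_name": "Huffduffer",
        "provider_url": "http://huffduffer.com/",
        "title": "An example piece of huffduffed audio",
        "html": "<iframe src=\"http://huffduffer.com/adactio/32454\" width=\"400\" height=\"120\"></iframe>",
        "width": 400,
        "height": 120
    }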
YouTube, Flickr, Vimeo and a whole host of other sites support oEmbed, and the Embedly service provides easy access to all of them. Now Huffduffer is listed amongst the 160 oEmbed providers supported by Embedly. If I make the right ritual sacrifices, perhaps Huffduffer players will start showing up in New Twitter: it uses a combination of oEmbed and a whitelist to display third-party content in the side panel.
I made some tweaks to the front-end of Huffduffer recently too. For starters, you might notice that the body copy font size has been bumped up from fourteen pixels to sixteen. While fourteen pixels is perfectly fine for Helvetica or Georgia, it's just that little bit too small for Baskerville.
While I was in there messing around with the CSS, I took the opportunity to tweak the small screen rendering.
For a start, I changed the way that the media queries are executed. Instead of beginning with the wide-screen "desktop" layout as the default and then undoing the widths and floats for smaller screens, I'm now using the same technique I've tried out here on adactio.com: begin with a linear layout-less flow and only add widths and floats within media query blocks. That way, mobile devices that don't support media queries will still get the linearised view.
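In CSS terms, the pattern looks something like this; the selectors, widths and thirty-em breakpoint are illustrative rather than lifted from either site:

    /* Linear, layout-less default: every device gets this */
    .content,
    .sidebar {
        margin: 1em;
    }
    /* Widths and floats live only inside media query blocks,
       so browsers without media query support stay linearised */
    @media screen and (min-width: 30em) {
        .content {
            float: left;
            width: 65%;
        }
        .sidebar {
            float: right;
            width: 30%;
        }
    }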
The elephant in the room is, once again, Internet Explorer (below version nine, anyway). While I can quite merrily say "screw 'em" here on adactio.com, I need to make more of an effort for Huffduffer. So I split up my CSS into two files: a global.css file that contains all the typography and colour rules, and a layout.css file that contains a default wide-screen "desktop" view followed by media queries for narrower widths. This is how I'm calling both stylesheets:
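The markup works out something like this (the file paths are illustrative, but the media query and conditional comment match the description below):

    <link rel="stylesheet" href="/css/global.css" media="all">
    <link rel="stylesheet" href="/css/layout.css" media="only screen and (min-width: 30em)">
    <!--[if lt IE 9]>
    <link rel="stylesheet" href="/css/layout.css" media="all">
    <![endif]-->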
See how the layout.css file is being called twice? Once for browsers that support media queries (with a browser width wider than thirty ems) and again for Internet Explorer less than version nine.
Mobile devices that don't support media queries or conditional comments will never load the layout.css file. Browsers that do support media queries, be they mobile or desktop, will only execute layout.css if the viewport is at least thirty ems wide. Legacy versions of Internet Explorer will always load layout.css because of the conditional comment. It's entirely possible that Windows Phone 7 will also load layout.css because the browser is currently using an IE7 codebase (Trident 3.1, to be precise). Screw 'em …at least until Microsoft bring out an update.
The disadvantage of this technique is that my CSS is now split up into two separate files. I'd much rather keep HTTP requests to a minimum by having just one style sheet, but I think that, in this case, the reward in cross-browser compatibility is worth the expense of that extra hit.
While I was testing the changes, I noticed something interesting on my iPod Touch when I was at the Clearleft office, where we have the stereo connected to the WiFi network. The most recent iOS update allows you to stream directly from your device to your stereo or television. What I didn't realise was that this is true of any media, including HTML5 video and audio content in a web page. Nice!
Tagged with
huffduffer
oembed
typography
responsive
design
mediaqueries
css
December 19, 2010
Home-grown and Delicious
I've been using Delicious since 2005—back when it was del.icio.us. I have over 2,000 bookmarks stored there. I moved to Magnolia for a while but we all know how that ended.
Back then I wrote:
Really, I should be keeping my links here on adactio.com, maybe pinging Delicious or some other social bookmarking site as a back-up.
Recently Delicious updated its bookmarklet-conjured interface, not for the better. I thought that I could get used to the changes, but I found them getting more annoying over time. Once again, I began to toy with the idea of self-hosting my bookmarks. I even exported all my data into a big XML file.
The very next day, some of Yahoo's shit hit the web's fan. Delicious, it was revealed, was to be sunsetted. As someone who doesn't randomly choose to use meteorological phenomena as verbs, I didn't know what that meant, but it didn't sound good.
As the twittersphere erupted in anger and indignation, I was able to share my recently-acquired knowledge:
curl https://{your username}:{your password}@api.del.icio.us/v1/posts/all to get an XML file of your Delicious bookmarks.
A lot of people immediately migrated to Pinboard, which looks like an excellent service (and happens to be the work of Maciej Ceglowski, one of the best bloggers ever to put pixels to screen).
After all that, it turns out that "sunsetting" doesn't mean "shooting in the head", it means something more like "flogging off", as clarified on the Delicious blog. But the damage had been done and, anyway, I had already made up my mind to bring my bookmarks in-house, so I began a fun weekend of hacking.
Setting up a new section of the site for links and importing my Delicious bookmarks was pretty straightforward. Creating a bookmarklet was pretty easy too—I already had some experience of that with Huffduffer.
So now I'll do my bookmarking right here on my own site. All's well that ends well, right?
Well, not quite. Dom sounded a note of concern:
sigh. There goes the one thing I actually used delicious for, the social network. :(
Paul also pointed to the social aspect as the reason why he's sticking with Delicious:
Personally, while I've always valued the site for its ability to store stuff, what's always made Delicious most useful to me is its network pages in general, and mine in particular.
But it's possible to have your Delicious cake and eat it at home. The Delicious API makes it quite easy to post links so I've added that into my own bookmarking code. Whenever I post a link here, it will also show up on my Delicious account. If you're subscribed to my Delicious links, you should notice no change whatsoever.
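Posting is a single authenticated request to the version one API, much like the posts/all command quoted earlier; something along these lines, with placeholder credentials and values:

    # posts/add mirrors the posts/all call quoted earlier; parameter values must be URL-encoded
    curl "https://{your username}:{your password}@api.del.icio.us/v1/posts/add?url=http://example.com/&description=An+example+bookmark&tags=example"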
This is exactly what Steven Pemberton was talking about when I liveblogged his XTech talk two years ago. Another Stephen, the good Mr. Hay, summed up the absurdity of the usual situation:
For a while we've posted our data all over the internet on all types of services. These services provide APIs so we can access the data we put into them, so that we can do things with that data. Read that again.
Now I'm hosting the canonical copies of my bookmarks, much like Tantek hosts the canonical copies of his tweets and syndicates them out to Twitter. Delicious gets to have my links as well, and I get to use Delicious as a tool for interacting with my data …only now I'm not limited to just what Delicious can offer me.
Once I had my new links section up and running, I started playing around with the Embedly API (I recently added the excellent oEmbed format to Huffduffer and I was impressed with its power). Whenever I bookmark a page with oEmbed support, I can pull content directly into my site. Take a look at the links I've tagged with "sci-fi" to see some examples of embedded Vimeo and Flickr content.
I definitely prefer this self-hosting-with-syndication way of doing things. I can use a service like Delicious without worrying about it going tits-up and taking all my data with it. The real challenge is going to be figuring out a way of applying that model to Twitter and Flickr. I'm curious to see which milestone I'll hit first: 10,000 tweets or 10,000 photos. Either way, that's a lot of my content on somebody else's servers.
Tagged with
delicious
links
syndication
preservation
oembed
bookmarks
December 9, 2010
One web
I was in Dublin recently to give a little talk at the 24 Hour Universal Design Challenge 2010. It was an interesting opportunity to present my own perspective on web design to an audience that consisted not just of web designers, but designers from many different fields.
I gave an overview of the past, present and future of web design as seen from where I'm standing. You can take a look at the slides but my usual caveat applies: they won't make much sense out of context. There is, however, a transcript of the talk on the way (the whole thing was being captioned live on the day).
Towards the end of my spiel, I pointed to Tim Berners-Lee's recent excellent article in Scientific American, Long Live the Web: A Call for Continued Open Standards and Neutrality:
The primary design principle underlying the Web's usefulness and growth is universality. When you make a link, you can link to anything. That means people must be able to put anything on the Web, no matter what computer they have, software they use or human language they speak and regardless of whether they have a wired or wireless Internet connection. The Web should be usable by people with disabilities. It must work with any form of information, be it a document or a point of data, and information of any quality—from a silly tweet to a scholarly paper. And it should be accessible from any kind of hardware that can connect to the Internet: stationary or mobile, small screen or large.
We're at an interesting crossroads right now. Recent developments in areas like performance and responsive design mean that we can realistically pursue that vision of serving up content at a URL to everyone, to the best ability of their device. At the same time, the opposite approach—creating multiple, tailored URLs—is currently a popular technique.
At the most egregious and nefarious end of the spectrum, there's Google's disgusting backtracking on net neutrality which hinges on a central conceit that spits in the face of universality:
…we both recognize that wireless broadband is different from the traditional wireline world, in part because the mobile marketplace is more competitive and changing rapidly. In recognition of the still-nascent nature of the wireless broadband marketplace, under this proposal we would not now apply most of the wireline principles to wireless…
That's the fat end of the wedge: literally having a different set of rules for one group of users based on something as arbitrary as how they are connecting to the network.
Meanwhile, over on the thin end of the wedge, there's the fashion for serving up the same content at different URLs to different devices (often segregated within a subdomain like m. or mobile.—still better than the crack-smoking-inspired .mobi idea).
It's not a bad technique at all, and it has served the web reasonably well as we collectively try to get our heads around the expanded browser and device landscape of recent years …although some of us cringe at the inherent reliance on browser-sniffing. At least the best practice in this area is to always offer a link back to the "regular" site.
Still, although the practice of splintering up the same content to separate URLs and devices has been a useful interim step, it just doesn't scale. It's also unnecessary.
Most of the time, creating a separate mobile website is simply a cop-out.
Hear me out.
First of all, I said "most of the time." Maybe Garrett is onto something when he says:
It seems responsive pages are best for content while dedicated mobile pages are best for interactive applications. Discuss.
Although, as I pointed out in my brief list of false dichotomies, there's no clear delineation between documents and applications (just as there's no longer any clear delineation between desktop and mobile).
Still, let's assume we're talking about content-based sites. Segregating the same content into different URLs seems like a lot of work (quite apart from violating the principle of universality) if all you're going to do is remove some crud that isn't necessary in the first place.
As an example, here's an article from The Guardian's mobile site and here's the same article as served up on the www. subdomain.
Leaving aside the way that the width is inexplicably set to a fixed number of pixels, it's a really well-executed mobile site. The core content is presented very nicely. The cruft is gone.
But then, if that cruft is unnecessary, why serve it up on the "desktop" version? I can see how it might seem like a waste not to use extra screen space and bandwidth if it's available, but I'd love to see an approach that's truly based on progressive enhancement. Begin with the basic content; structure it to best fit the screen using media queries or some other form of feature detection (not browser detection); pull in extra content for large-screen user-agents, perhaps using some form of lazy loading. To put it in semantic terms, if the content could be marked up as an aside, it may be a prime candidate for lazy loading based on device capability:
The aside element represents a section of a page that consists of content that is tangentially related to the content around the aside element, and which could be considered separate from that content.
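As a rough sketch of that test-then-fetch idea (this isn't code from any of the sites mentioned; the breakpoint, URL and element ID are all hypothetical):

    // Only fetch the tangential content when the viewport is wide enough
    // to make use of it, and only where matchMedia is supported.
    if (window.matchMedia && window.matchMedia('(min-width: 48em)').matches) {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', '/related.fragment.html', true); // hypothetical URL
        xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
                var extra = document.createElement('aside');
                extra.innerHTML = xhr.responseText;
                document.getElementById('content').appendChild(extra); // hypothetical ID
            }
        };
        xhr.send(null);
    }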
I'm being unfair on The Guardian site …and most content-based sites with a similar strategy. Almost every site that has an accompanying mobile version—Twitter, Flickr, Wikipedia, BBC—began life when the desktop was very much in the ascendancy. If those sites were being built today, they might well choose a more responsive, scalable solution.
It's very, very hard to change an entire existing site. There's a lot of inertia to battle against. Also, let's face it: most design processes are not suited to solving problems in a device-independent, content-out way. It's going to be challenging for large organisations to adjust to this way of thinking. It's going to be equally challenging for agencies to sell this approach to clients—although I feel Clearleft may have a bit of an advantage in having designers like Paul who really get it. I think a lot of the innovation in this area will come from more nimble quarters: personal projects and small startups, most likely.
37 Signals recently documented some of their experiments with responsive design. As it turned out, they had a relatively easy time of it because they were already using a flexible approach to layout:
The key to making it easy was that the layout was already liquid, so optimizing it for small screens meant collapsing a few margins to maximize space and tweaking the sidebar layout in the cases where the screen is too narrow to show two columns.
In the comments, James Pearce, who is not a fan of responsiveness, was quick to cry foul:
I think you should stress that building a good mobile site or app probably takes more effort than flowing a desktop page onto a narrower piece of glass. The mobile user on the other side will quite possibly want to do different things to their desktop brethren, and deserves more than some pixel shuffling.
But the very next comment gets to the heart of why this well-intentioned approach can be so destructive:
A lot of mobile sites I've seen are dumbed down versions of the full thing, which is really annoying when you find that the feature you want isn't there. The design here is the same site adapted to different screens, meaning the end product doesn't lose any functionality. I think this is much better than making decisions for your users as to what they will and won't want to see on their mobile phone.
I concur. I think there's a real danger in attempting to do the right thing by denying users access to content and functionality "for their own good." It's patronising and condescending to assume you know the wants and needs of a visitor to your site based purely on their device.
The most commonly-heard criticism of serving up the same website to everyone is that the existing pages are too large, either in size or content. I agree. Far too many URLs resolve to bloated pages locked to a single-width layout. But the solution is not to make leaner, faster pages just for mobile users; the answer is to make leaner, faster pages for everybody.
Even the brilliant Bryan Rieger, who is doing some of the best responsive web design work on the planet with the Yiibu site, still talks about optimising only for certain users in his otherwise-excellent presentation, The End of Unlimited Bandwidth.
When I was reading the W3C's Mobile Web Best Practices, I was struck by how much of it is sensible advice for web development in general, rather than web development specifically for mobile.
This is why I'm saying that most of the time, creating a separate mobile website is simply a cop-out. It's a tacit acknowledgement that the regular "desktop" site is beyond help. The cop-out is creating an optimised experience for one subset of users while abandoning others to their bloated fate.
A few years back, there was a trend to provide separate text-only "accessible" websites, effectively ghettoising some users. Nowadays, it's clear that universal design is a more inclusive, more maintainable approach. I hope that the current ghettoisation of mobile users will also end.
I'm with Team Timbo. One web.
Tagged with
mobile
responsive
design
performance
oneweb
accessibility
December 4, 2010
Postscript to Space
One of the mailing lists I subscribe to is the Brighton Speculative Fiction group. If I rightly recall, I signed up whilst drunk at a party I had gatecrashed in Kemptown.
What? Like it's never happened to you. I suppose you've never woken up the morning after the night before, clutching your aching head and moaning "Oh man, I hope I didn't edit any wikis last night!"
Anyway.
The Brighton Speculative Fiction group meets regularly in the excellent Basketmaker's Arms to talk sci-fi and swap books. My copy of The Demolished Man is making the rounds while I've snagged a copy of one of Arthur C. Clarke's earliest works, Prelude To Space. It reads like an alternative history novel, imagining what it would have been like if the space race had been led from the UK rather than the US.
Early on in the book, a character explains that peculiarly British word "boffin":
Good lord, don't you know that word? It goes back to the War, and means any long-haired scientific type with a slide-rule in his vest pocket.
That reminded me of the thoroughly enjoyable book Backroom Boys by Francis Spufford, filled with stories of post-war British innovation: everything from "spitfires in space" rocketry ambitions through to the creation of Elite and Vodafone.
But when Clarke published Prelude To Space in 1953, the idea of Britain leading the charge into space wasn't a far-fetched flight of fancy. If anything, it was a straightforward linear extrapolation. Before the PR battle of the superpowers kicked off with Sputnik, America had shown no interest in spaceflight, much less putting men on the moon.
I know this, not just because Arthur C. Clarke mentions it in the foreword, but also because of the first episode of the Space Dog podcast which features an interview with Arthur C. Clarke gleaned from The Science Fiction Oral History Association, wherein he talks about Prelude To Space.
Space Dog Podcast Episode One on Huffduffer
In fact, I've been huffduffing a host of Arthur C. Clarke-related material lately. The motherlode is this three-way discussion with Clarke, Margaret Mead and Alvin Toffler on 2001: A Space Odyssey, technology, and the future of mankind. They discuss the idea of the singularity without explicitly calling it that—this was recorded long before Vernor Vinge coined the term.
Arthur C. Clarke, Alvin Toffler, and Margaret Mead on Man's Future on Huffduffer
Listening to this on my iPod on my walk into work, I had a pleasant tingle of recognition when Alvin Toffler, author of Future Shock, was introduced as a consultant to the Institute For The Future …the organisation that provided the location for a latter-day gathering of web-enabled boffins: Science Hack Day San Francisco.
Tagged with
sci-fi
sciencefiction
space
audio
huffduffer
December 2, 2010
A brief list of false dichotomies
In the world of web development, there are many choices that are commonly presented as true or false, black and white, Boolean, binary values, when in fact they exist in a grey goo of quantum uncertainty. Here are just three of them…
Accessible and inaccessible
Oh, would that it were so simple! If accessibility were a "feature" that could be controlled with the flick of a switch, or the completion of a checklist, front-end web development would be a whole lot easier. Instead it's a tricky area with no off-the-shelf solutions and where the answer to almost every question is "it depends." Solving an accessibility issue for one set of users may well create problems for another. Like I said: tricky.
Documents and applications
Remember when we were all publishing documents on the web, but then there was that all-changing event and then we all started making web apps instead? No? Me neither. In fact, I have yet to hear a definition of what exactly constitutes a web app. If it means any URL that allows the user to interact with the contents, then any document with the slightest smattering of JavaScript must be considered an application. Yet even on the desktop, applications like email clients, graphics programs and word processors are based around the idea of creating, editing and deleting documents.
Perhaps a web app, like obscenity, cannot be defined but can only be recognised. I can point to Gmail and say "that's a web app." I can point to a blog post and say "that's a document." But what about a Wikipedia article? It's a document, but one that I or anyone else can edit. What about Twitter? Is it a collection of documents of fewer than 140 characters, or is it a publishing tool?
The truth is that these sites occupy a sliding scale from document to application. Rather than pin them to either end of that scale, I'm just going to carry on calling them websites.
Desktop and mobile
The term "mobile phone" works better than "cell phone" because it defines a phone by its usage rather than a particular technology. But it turns out that the "phone" part is just as technologically rigid. Hence the rise of the term "mobile device" to cover all sorts of compact Turing machines, some of which just happen to be phones.
In web development, we speak now of "designing for mobile," but what does that mean? Is it literally the context of not being stationary that makes the difference? If so, I should be served up a different experience when I use my portable device in the street to when I use the same device sitting down in the comfort of my home or office.
Is it the bandwidth that makes the difference? That would imply that non-mobile devices don't suffer from network scarcity. Nothing could be further from the truth. Performance is equally important on the desktop. It may even be more important. While a user may expect a certain amount of latency when they're out and about, they are going to have little patience for any site that responds in a less than snappy manner when they're at home connected with a fat pipe.
Is it screen size that matters? That would make my iPod Touch a mobile device, even though whenever I'm surfing the web on it, I am doing so over WiFi rather than Edge, 3G, or any other narrow network. What about an iPad? Or a netbook? When I was first making websites, the most common monitor resolution was 640 by 480 pixels. Would those monitors today be treated as mobile devices simply because of the dimensions of the screen?
I'll leave the last word on this to Joaquin Lippincott who wrote this in a follow-up comment to his excellent article, Stop! You are doing mobile wrong!:
Devices really should be treated as a spectrum (based on capabilities) rather than put into a mobile vs. desktop bin.
Tagged with
web
development
accessibility
mobile
December 1, 2010
Adfonting
It's the start of the Christmas season. I know it's the start of the Christmas season, not just because Brighton is currently blanketed in snow, but also because 24 Ways—the advent calendar for geeks—has kicked off with its first article. Hurray! And this year, all of the articles will be available as a book from Five Simple Steps for a mere £8, with all the proceeds going to charity. Grab a copy before the end of December because this is a time-limited offer.
This year, 24 Ways isn't the only advent calendar for geeks. While I was off gallivanting up and down the west coast of the US last month, my cohorts at Clearleft were scheming up a little something special: an advent calendar for fonts. Every day, for 24 days, they would release a Fontdeck font for one year's free use.
When they told me, I thought "great idea!" …then they told me they were going to call it an "adfont" calendar and there was much groaning and gnashing of teeth.
The Adfont Calendar 2010 (groan) is now live.
The lovely visual design comes courtesy of Michelle, the latest addition to the Clearleft team, and it mimics a type case, just like the one we happen to have in the office. Every office needs a type case.
Originally, the interface was going to be one looooong type case with some JavaScript layered on top to allow smooth horizontal navigation. But when Rich asked me for some advice on implementing it, I steered him down a different path. Instead of displaying everything horizontally, why not use media queries to show just enough drawers to fit the user's browser window and allow the rest to stack vertically?
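A hypothetical sketch of that approach, with invented class names and breakpoints:

    /* Drawers stack vertically by default */
    .drawer {
        width: 100%;
    }
    /* Sit side by side as the viewport allows */
    @media screen and (min-width: 40em) {
        .drawer {
            float: left;
            width: 50%;
        }
    }
    @media screen and (min-width: 60em) {
        .drawer {
            width: 33.33%;
        }
    }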
I didn't think he'd take my challenge seriously but he's only gone and bloody done it!
Have a poke around and see what's behind drawer number one.
Tagged with
adfont
calendar
fontdeck
responsive
design
mediaqueries
css
November 30, 2010
Spacelogging
When I was gushing enthusiastically about Old Weather, I tried (and failed) to explain what it is that makes it so damn brilliant. I've just experienced some of that same brilliance. This time the source is Spacelog:
Read the stories of early space exploration from the original NASA transcripts. Now open to the public in a searchable, linkable format.
You can now read the transcripts from the Apollo 13 and Mercury 6 missions, and every single utterance has a permalink.
The beauty of the idea is matched in the execution. Everything about the visual design helps to turn something that was previously simply information into an immersive, emotional experience. It's one thing to know that these incredible events took place, it's another to really feel it.
Spacelog shares the spirit of Science Hack Day. It's a /dev/fort creation, put together in an incredibly short period of time; Norm! has the low-down.
Apollo 13 and Mercury 6 are just the start. If you want to help turn more transcripts into an emotionally engaging work of hypertext, everything is available under a public domain license and all the code is available on GitHub. Transcripts are available for Gemini 6, Apollo 8, and Apollo 11.
I can't wait to read Charlie Duke as hypertext.