Lately it seems most of the commentary that stimulates my thinking happens on Twitter. Or maybe it’s because that’s where I spend most of my time. At any rate, a couple of tweets got me thinking about what it means to work “in tech.”
Black media used the term "techie" too loosely. Every Black person who uses a computer and went to SXSW is not a "techie".
— April Davis (@April_Davis) March 15, 2013
It’s becoming increasingly irritating when “women in tech” is used to mean “women who code.” Quite bothersome.
— Ho. Ho. Ho. (@pnkrcklibrarian) December 24, 2013
I’m a SXSW veteran, although I haven’t been back in about five years. While I was never on a Blacks in Tech panel, I don’t think it’s a stretch to say that I was there when the panels were getting off the ground. But I wasn’t a developer; I only puttered around with code in my free time, and my work was largely in the user experience/usability/information architecture sphere.
I went to library school because I was interested in the organization of information in online spaces (and because it was cheaper than biz school). I worked as a graduate student assistant in the department’s IT lab, where I ghosted computers between terms, set up space on the LAN for students, and provided hardware and software support to my colleagues. Before that, I’d worked in technical support at an internet service provider, where I helped people troubleshoot their internet connections and, later, their cable modem connections. I moved into the web hosting department, where setting up a FreeBSD machine from scratch (including the desktop environment of your choice) was a condition of employment.
I’ve written shell scripts, and I’ve been the equivalent of a junior server administrator. These days, I’m an assistant manager in the digital services department at a major urban library. But would you say that I work “in tech”?
Discussions about core technical competencies for librarians and library technical staff are de rigueur these days, and as often happens, someone makes the claim that it is absolutely necessary for librarians to learn to code, because being able to build our own stuff is the future (I guess).
But is it, really?
What if, instead of asking people to focus only on the hard skills of coding and being a maker, we asked people to learn more about ways they can use technology to their advantage? What if we didn’t only privilege production, but made room for mediation, collaboration, and interpretation? What if we decided that knowing how to use a technology in the service of our patrons and other staff is just as important a skill as knowing how to use a vendor’s API to build a web app tailored to your specific library’s needs?
Knowing how to write — and interpret — code is a solid marketable skill. But so is knowing how to talk tech. Splitting hairs on whether someone who knows how to use a laptop is a real techie is an artificial distinction, and y’all know how I feel about distinctions that keep us apart.
Future-ready librarians won’t (only) know how to code; they’ll know how to speak the language and adapt it in such a way that everyone in their audience – from the most experienced to the least – will be able to easily understand technology and its uses.
Additional reading that inspired this post: “You Don’t Need To Learn To Code & Other Truths About the Future of Careers” by Sean Blanda.