The Sensor Society: Invisible Infrastructure

So much about the Internet age has turned out to be vexing–a far cry from my enthusiasm a generation ago, shared by many nerds, for a utopian future of connectivity and a library of one’s dreams.

“I would not open windows into men's souls.” — Attributed to Queen Elizabeth I

Hard to know where to start with the disappointments and fears, but one that particularly nags is the feeling that we are building (with our eyes closed and tacit consent) an infrastructure that monitors our every move, encasing every one of us in a personal surveillance state, in return for the convenience of carrying a connected device everywhere we go.

Australian Prof. Mark Burdon has termed this the “Sensor Society,” the notion that passively, without our knowledge or consent, and for unknown purposes, everything we do becomes raw data for commercial discovery (and possibly for government snooping). This follows inevitably from the “always on/always connected” world, but is it too high a price to pay?

The entire interview is worth reading, but herewith a few bracing bits:

Q: What are the implications if sensors completely permeate society?

A: Well, it’s not necessarily just about the complete permeation of sensors. Rather, the greater implications regard the emergence of pervasive and always on forms of data collection. The relationship between sensors, the data they produce, and ourselves is important to understand.

For example, sensors don’t watch and listen. Rather, they detect and record. So sensors do not rely on direct and conscious registration on the part of those being monitored. In fact, the opposite is the case. We need to be passive and unaware of the sensing capabilities of our devices for the sensors to be an effective measurer of our activity and our environments.

Our relationship with our devices as sensors is consequently a loaded one. We actively interact with our devices, but we need to be passively unaware of the sensors within our devices. The societal implications are significant—it could mean that everything we do is collected, recorded and analysed without us consciously being aware that such activities are taking place because collection is so embedded in daily life.

Q: How would you recommend someone learn more about the impact of living in a sensor society?

A: Look at your everyday devices in a different way. Behind the device and the sensor are vast and imperceptible, invisible infrastructures. Infrastructures of collection enable the explosion of collectible data and infrastructures of prediction enable understanding and thus give purpose to sensors. Otherwise, sensor-generated data without an analytical framework to understand it is just a mountain of unintelligible data.

The sensor society, therefore, redirects us towards the hidden technological processes that make data collection, capture, storage, and processing possible. This, in turn, highlights the importance of understanding relations of ownership and control of sensors and the infrastructures in which sensors operate. So when you’re at home with your devices, realize that you are not alone and just think about those invisible infrastructures that are also present with you. The question to ask then is: What data is being collected, by whom, and for what purpose?

Our metadata, ourselves… how are we ever to be left alone? He’s got a good TEDx talk as well.

“[The framers] sought to protect Americans in their beliefs, their thoughts, their emotions and their sensations. They conferred, as against the government, the right to be let alone – the most comprehensive of rights and the right most valued by civilized men.” — Louis Brandeis


Web Production Tips, Redux

A while back, I got asked for tips on Web production; a little random, but offered for what it’s worth. Slightly updated from an earlier post of 2/20/2018. Feel free to share with those in the web game.

Random Website Production Tips courtesy of Arthur Smith

  • Sign your work: I’ve found it useful to make sure people put their names (or at least initials) on all concept stage and later docs (specs, treatments, mission statements, wireframes, page designs, whatever). I used to only ask for dates, but “signing your work” may actually improve quality and it is also a godsend later when you are trying to track something down from an archive.
  • Number wireframes: Decimal numbering of navigation trees on wireframes/screen comps has been helpful. That is, assign a number to each path (home page = 1, a sub-page like “About Us” = 1.1, and so on), although as complicated hierarchies become less prevalent on sites, this may not be as important. (There is a minimal sketch of this numbering after the list.)
  • Responsive from the get-go: Check responsive view compatibility really early, and be psychologically prepared to clip the wings of a really nice design solution and content strategy if it’s not feasible.
  • Workflow realism: Be prepared for clients’ (or partners’) non-use of proposed workflow tools for content, delivery, and review, etc. To paraphrase Dorothy Parker, you can lead a client to Basecamp, Google Drive, Dropbox, SharePoint or whatever, but you can’t make them click. Most people are reluctant to add another tool to their work ecosystem, and a vendor seldom has the leverage to persuade them to do so. I ask how clients already work with copy and visual assets and how they do review for existing projects, and then work out an approach, based on that, that’s bearable for the web team. The answer is often email attachments, the bane of any web producer, but then I put the files into the tool I need and name them using the production convention. I work on small sites, and I realize something this ad hoc may not scale.
  • Milestones: Rally around a fixed external event to shape client schedules and expectations (even if it’s a sort of made-up event). It’s best if there is a plausible date to tie production milestones to, but even if not, designate one based on something on the calendar. For instance, the end of a semester for an academic client, the annual meeting for a non-profit, etc. Some public version of a deliverable should be tied to this, e.g., “we will focus-group the alpha at the next regional meeting.” What’s key is that such a date be front of mind for the client and feasible for you. End-of-contract dates don’t necessarily work this way; for one thing, they can be amended, but dates of public meetings cannot. If you have to keep looking up a date to see when something major is due, it’s a bad sign.
  • Presentation tools for sketches: Consider Keynote (or, if you must, PowerPoint) for concept- and design-stage work. If your designers are comfortable with it, and particularly if you are working from a family of templates, have designers create plausible Keynote “page shells” that can then be updated by producers and writers to iterate content strategy. This is sort of a hack, and it’s not what Keynote is for, per se, but it is much more convenient (and less costly) than working on content revisions in a designer/Photoshop workflow. For one thing, text in Keynote is much easier to move around and chunk, and the graphics tools are sufficient at least to wireframe, or even do simple page design. (Photoshop and other image tools are not text friendly.) Don’t leave the designer and developer out of this phase, though. It’s possible to create something in Keynote that is insane to execute. Periodic check-ins on feasibility and implications are good, as is working from a shell that both the designer and developer have seen and OK’d, or, better, participated in.
  • Start early on copy and other content: Also, if you are careful, you can get a jump on cleaning up the copy and sorting out images, rights, etc. in this Keynote stage, not just resolve the content strategy problems. Interestingly enough, now that I work mostly in WordPress, I still find that this Keynote workflow helps. WordPress sort of tempts you to think you can do iterative content development if you have chosen the right theme, etc. This hasn’t worked for me: it’s a publishing content management approach (presentation and content wrangling); it’s not a content creation tool, at least for anything editorially intensive. I think there still really isn’t a good content creation tool; a sort of GitHub for editorial would be part of it, but some 10-year-old is probably creating one. I know people have had success with iA Writer, and Medium is making a big splash, although I’m not sure that it is designed to work inside other workflows.
  • Map editorial workflows: Figure out in advance how copy (and other assets) are getting from the client or the writer to the site, and how revisions, proofreading/copyediting, and publishing will be done. Note whose job it is, and do a ‘pre-flight’ to make sure this approach will work. This is particularly important for big sites or conversion jobs.
  • Restrain your theme addiction: If you are in a WordPress world, or on another theme-based platform, don’t move to the theme-choice stage until content strategy is established. (This is really hard for me, and for others, I’m betting. There are so many themes, and it’s fun to think in a preliminary way about your project and then go “theme shopping.”) The danger is that you start thinking about the project in terms of what the theme will do, rather than what the project goals are. (It’s a version of having all those typefaces available in a design program: sort of irresistible to try them out early on.) It’s also a waste of time to learn a theme and its quirks only to find out later that you won’t use it. (Although this has only happened once in my experience!) If you are building a theme from scratch, or redoing most or all of the CSS, then this would not necessarily apply. It’s still good to have a clear idea of what success looks like for your content first.
  • Stay in touch: Send a weekly report email to clients; include milestones from your project spreadsheet or PM tool, and color-code tasks: on time (green), behind (red), ahead (blue), not started (black), so it can be glanced at (but not edited).
  • Keep track: Document as you go. (Like flossing: easy to say, hard to do.)
  • If timelines change, be realistic: If some stage takes a lot more time than it was scheduled and budgeted for, the natural tendency is to believe the “just so story” that you will “catch up” later. If the first stage took twice as long, it’s likely that the other stages will take twice as long too. Make an alternate Plan-B version of the schedule that uses this delay as a factor for all remaining stages. For example, if it took the client 10 weeks instead of 5 to provide copy and sign off on the concept spec, multiply all remaining stages (page designs, alpha, testing, whatever) by 2 and see where that gets you. (A small sketch of this arithmetic follows the list.) Figure out how you could manage such a schedule, and find a professional and respectful way to address this with your client. This isn’t very comfortable, but even just thinking it through yourself is useful. As difficult as it may be to bring up a delay like this with a client, it is easier than trying to miraculously do the work in half the time. Even if you end up eating the time, your client and team may well respect you for being realistic about it.
Web production don’ts: Labeling everything in dingbats on your sitemap!

  • Love your content: Finally, it’s my experience that if you like the content and do right by it, many workflow problems either don’t occur or are easier to solve. Even neutral professional content that isn’t your particular specialty can be rewarding to work on, and if everybody believes it is valuable to get it out to the people who need it, that gives you some basis for working together and solving problems when they come up. On the other hand, content that hasn’t been worked out pushes back, and chokes workflows.
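
As promised in the wireframe-numbering tip above, here is a minimal sketch of the idea in Python. The navigation tree, page titles, and the number_tree helper are all made up for illustration; they aren’t part of any tool mentioned in this list.

```python
# A minimal, illustrative sketch of decimal wireframe numbering: walk a toy
# navigation tree and assign outline numbers like 1, 1.1, 1.2.1, ...
def number_tree(pages, prefix=""):
    """Yield (number, title) pairs for a list of {'title', 'children'} dicts."""
    for i, page in enumerate(pages, start=1):
        number = f"{prefix}.{i}" if prefix else str(i)
        yield number, page["title"]
        yield from number_tree(page.get("children", []), prefix=number)

site = [
    {"title": "Home", "children": [
        {"title": "About Us"},
        {"title": "Programs", "children": [{"title": "Summer Session"}]},
    ]},
]

for number, title in number_tree(site):
    print(number, title)
# 1 Home
# 1.1 About Us
# 1.2 Programs
# 1.2.1 Summer Session
```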
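
And here is the Plan-B arithmetic from the timelines tip as a small sketch, under the same caveat: the stage names, week counts, and the plan_b helper are hypothetical, and the only real logic is multiplying the remaining stages by the slip factor you have already observed.

```python
# A minimal, illustrative sketch of the Plan-B schedule: scale every remaining
# stage by the slip factor observed so far. Stage names and weeks are made up.
def plan_b(remaining_stages, planned_weeks, actual_weeks):
    """Return remaining stage durations scaled by the actual/planned slip factor."""
    factor = actual_weeks / planned_weeks
    return {stage: round(weeks * factor, 1) for stage, weeks in remaining_stages.items()}

remaining = {"page designs": 4, "alpha build": 6, "testing": 3}

# Copy and concept-spec sign-off took 10 weeks instead of the planned 5:
print(plan_b(remaining, planned_weeks=5, actual_weeks=10))
# {'page designs': 8.0, 'alpha build': 12.0, 'testing': 6.0}
```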

The Mother of the Father of the World Wide Web

The web’s grandparents, Mary Lee and Conway Berners-Lee.

The British Library’s Sound and vision blog has a nice piece on Mary Lee Berners-Lee, mother of Tim, who, as everybody knows, wrote the first spec for what became the World Wide Web while working at CERN in the late 1980s.

”After studying mathematics at the University of Birmingham, she [MLBL] spent the latter part of the Second World War working at the Telecommunications Research Establishment (TRE), the secret centre of Britain’s radar development effort. With the war over she returned to her studies, before leaving Britain for the Mount Stromlo observatory in Australia in 1947, where she worked classifying the spectra of stars. In 1951 she returned to Britain and chanced across an advert for a job at Ferranti in Manchester that would change her life: “I was reading Nature and saw an advertisement one day for – saying, ‘Mathematicians wanted to work on a digital computer.’”

One of many “voices of science” in the Library holdings.

Commonplace Book: Lament for Blogging & the Internet

Tipped by the always readable Farhad Manjoo, a NYTimes tech writer (with a good Twitter name), I checked out Jia Tolentino’s lament for blogging in The New Yorker, pegged in part to the closing of The Awl, sort of the blog equivalent of an alternative daily.

It’s a nice piece, although perhaps a bit impredicative in that it (I assume unwittingly) embodies some of the reasons people might not be so interested in blogs any more. Although it oversells a “golden age” of the Internet that has been lost (such vanished Edens have always been with us, although perhaps they are disappearing over the horizon faster and faster), there is a sense of fun that has diminished (even for somebody who barely even qualifies as a blogger, like me).

In passing she quotes Alex Balk, writing in 2015. He was a founder of The Awl, and his update to the ‘lament for the makers‘ is bracing:

I have previously shared with you Balk’s Law (“Everything you hate about The Internet is actually everything you hate about people”) and Balk’s Second Law (“The worst thing is knowing what everyone thinks about anything”). Here I will impart to you Balk’s Third Law: “If you think The Internet is terrible now, just wait a while.” The moment you were just in was as good as it got. The stuff you shake your head about now will seem like fucking Shakespeare in 2016.

Perhaps true, but also perhaps ever so, for more than just the Internet. Somewhere I recall a Mark Twain quote: “no matter what the show, the golden age seems to have ended the day before I bought my first ticket.”

Answers to Questions You Didn’t Ask

CES (formerly the Consumer Electronics Show) in Las Vegas is the ultimate nerd destination, and yes, once my husband and I made a vacation of it. We learned that it isn’t all that accessible for ordinary gearhead consumers (it’s more B2B and a media showcase), but as an anthropological experience it’s certainly something.

This year’s show probably has a full plate of drama, given that everything with a chip in it may have a security vulnerability, but the sad/funny chapter has inadvertently been penned by a company that created a piece of luggage that follows you around. Here is what you need to know:

1. It’s called 90Fun’s Puppy 1.

2. They gave one to a reporter at The Verge to test.

She gave it a spin:

 So far, all these bags seem more like proof of concepts than gadget of the year. A market exists for suitcases that cater better to those with mobility impairments, but I am not yet convinced this is the solution.

90Fun plans to take preorders for the Puppy 1 during the second half of 2018 in a crowdfunding project, which means it’s got some time to work out the kinks. For now, think of it as an actual, untrained puppy. In theory, it’s cute that a dog will follow you wherever you go, but pair that with the idea of enlisting a puppy to drag your luggage around the airport… and it’s about as useful as it sounds.

Somehow the sight of it falling helplessly to the ground makes me feel for it too…which is completely ridiculous!

Your Very Own Search Engine

The adage, “Any sufficiently advanced technology is indistinguishable from magic,” attributed to Arthur C. Clarke, seems more applicable than ever.

There is a corollary I would like to offer: “Any sufficiently advanced cloud application trailer is indistinguishable from a cheerfully ominous sci-fi novel opening scene.” To wit, I offer:

https://www.unforgettable.me/

This is a service that, once hooked into your devices (in particular your smartphone) and social media feeds, tracks everything and compiles it into your very own search repository. They are in beta and have an explanatory video:

Diaries as they once were.

Brilliant? Creepy? Both? A solution in search of a problem? Or an early sign of how we will someday outsource our memories to our cloud storage lockers?

Aside from the rich ground for fictional (or humorous) speculation (how do you call tech support to upgrade or correct your own memories? Is there a “fish story” plugin?), it does immediately raise questions about privacy, identity, and the invasion/malleability thereof. What happens to this mass of data after somebody dies? Is it uploaded into a new person’s avatar? The sci-fi possibilities really are endless.

Yet at the same time, before I get my hackles up, all this data is already being collected by the Big Brotherhood, and quite possibly the NSA. Unforgettable.Me is at least offering your data as a service to you, instead of offering you up as raw material for ad revenue and the like.

Another mangled quote: “O brave new world that has such data in it.”

Books as Luxury Goods

TechCrunch has an interesting piece on the difference between print and e-books. It is written by Chris Lavergne, identified as “the CEO of Thought.is and the publisher of Thought Catalog,” both unknown to me. (And visits to the sites are a bit mystifying; I think I’m missing some of the context, or am so far from the core audience that it goes over my head.)

But the bit of the article that caught my eye was a discourse about electronic publishing versus real books:

“The medium is indeed the message

We were surprised to learn that print books and digital books were almost two distinct businesses with totally different operating models. While a print book and an e-book share identical content, they reflect diametrically opposed media formats. Print books are luxury goods and e-books are utility, and this has real implications in the strategy and workflow behind the marketing and production of each.

This technical distinction is also present in consumer behavior. E-books — with their instant access and cheap prices — sell generally 6x more quantities than print books for us. That said, a print book will generally generate 7x more revenue than an e-book. It’s hard to generate revenue on an e-book because the whole premise of the platform is: I want this quickly and at the cheapest price possible. The premise of a print book in the digital age is driven by luxury: I read better on paper… or… I like the feeling of turning a page.

You can’t create much markup on utility, whereas you can create a great deal of markup on luxury. This has been perhaps one of the most important insights driving Thought Catalog Books’ growth. The print books department needs to be run like a luxury goods company, while the e-book department needs to be run like a technology company. The content is the same, but the medium dictates an entirely different business model.”

This seems plausible (if arguable) to me. I’m not much of an e-book reader, not because I’m opposed to the format, but just because of the long habit of print books, and more cognitive comfort and personal efficiency with them. (I do read them once in a while; because I don’t buy them, they tend to be oddball classics I can get off of Project Gutenberg. Currently it’s Three Men in a Boat, and previously I read News from Nowhere on my iPad, a singularly inappropriate title for an e-reader, given William Morris’s attitudes about technology.)

Books aren’t luxury items for me, luckily enough, but I can see that for somebody who was born digital, books, magazines, and eventually newspapers too, are prestige items (like the encyclopedia sets of my youth, or the solemn, and generally unread, volumes by Will and Ariel Durant). It’s an odd thought: a once-dominant (and for me still far more companionable and effective) medium is now becoming a prestige lifestyle accessory. What could that status mean for libraries and for publishers? And how might e-books, with their different business model (if the article is correct), find an incentive to address access and literacy worldwide, given that, according to the U.N., more people have access to electronic devices than to toilets?

MOOCs Evolve

Would Abe have been a MOOC student? (Cover of a publication from the International Correspondence Schools, c. 1908).

Now that we are half a decade or so into the MOOC revolution, it’s interesting to see it sort out and calm down a bit. Although it hasn’t quite fulfilled the utopian aspirations of the early evangelists, it has provided a useful means to get content to learners (particularly in tech areas). While it’s unclear how the business models are doing (probably not all that well), people and institutions have benefited.

As somebody who is interested in curriculum, class structure, and the rhetorical forms that educational content takes (why 13 weeks? why lectures? etc.), I was puzzled by the slavish effort of MOOCs to reproduce the highly artificial structure of an on-campus course. This seemed to me a clear example of the Marshall McLuhan adage that the first thing that happens with a new medium is that you use it to deliver an old form. (Radio shows were the first thing on TV.)

There still is an excessive amount of ‘course-ness’ to the average MOOC, but Dhawal Shah reports that the format is moving from scheduled semesters to basically on demand. A “Netflix” of education.

He writes, “MOOCs are gradually being transformed from virtual classrooms to a Netflix-like experience. Many courses are no longer offered just once or twice a year, but rather are now available as a self-paced, sign-up-whenever-you-want experience. Coursera courses are now offered regularly throughout the year, with new sessions starting automatically on a bi-weekly or monthly basis.”

https://www.edsurge.com/news/2016-12-29-monetization-over-massiveness-breaking-down-moocs-by-the-numbers-in-2016

A very welcome development, not just because mapping academic calendar conventions onto MOOCs was silly, but because opening things up on demand may lead to content innovation. It happened with Netflix, which helped usher in new blood, and arguably even new formats, into fiction and non-fiction television. Education could do worse…