The Ending of Rigid Time

A section of the Prague astronomical clock.

The history of humanity is the history of tracking and managing information. Need to grow enough wheat or rice to feed your village? Then you must track the seasons to determine the best times to plant your seeds. Want to remember society's history so you don’t continually repeat it, or determine how many of your kingdom’s subjects pay their taxes? Then you’ll need systems of writing and counting.

And central among humanity’s informational tools is the measurement of time.

Exactly how humanity began tracking time is lost in the unrecorded millennia of human prehistory. But it’s safe to assume our ancestors tracked time long before recorded history began. There’s ample archaeological evidence that ancient hunter-gatherers tracked the seasons, and sites like Stonehenge take account of celestial time indicators such as the solstices.

But as more complex civilizations developed, so did the need for specific methods of measuring time for religious ceremonies, trade, and military and administrative duties. So at some point in our history humans began to track the individual parts of each day.

One way we did this was by measuring the passage of time using recurring natural phenomena. The passage of the sun through the sky. The dripping of water through a small hole. The resulting devices — shadow clocks, sundials, water clocks, hourglasses — flourished for centuries. Now we track time with technologies up to and including atomic clocks, which measure the frequency with which electrons move from one energy level to another. With atomic clocks we have reached the point where we track time with far more precision than humans can notice with our physical senses.

But precision comes at a price: the chaining of a sizable portion of humanity to an extremely rigid form of time. You must be at school at this exact time. You must begin and end work at other exact times. You must never be late because that is disrespectful to our notions of time management.

Taken to the extreme this sense of absolute time management results in the dystopian world of Harlan Ellison’s classic science fiction story “‘Repent, Harlequin!’ Said the Ticktockman,” where everyone must obey a rigid time schedule throughout their lives or suffer the consequences.

Thanks to technology, I suspect Ellison’s time-keeping nightmare won’t come true. Instead, it’s possible we’re about to enter a period which we might well call the ending of rigid time.

But whether this will be good or bad has yet to be determined.

To understand the pending change in how humanity uses time, look in your pocket. Is a smartphone hiding there? If so you have access to systems which can chart and manage every aspect of your life — the digital equivalent of a personal assistant who ensures you never miss an appointment or your kid’s birthday.

And we’re merely at the beginning of such interconnected personal scheduling. In the near future this type of technology will no doubt evolve until some aspect of wearable or implantable tech allows your personal assistant to whisper in your ear or access your mind during every moment of your life. Spend a few minutes setting up your life preferences and you won’t need to worry about when to work or go to school or study or exercise or have a beer with friends.

In such an integrated scheduling system, why would the average human need to even pay attention to time? After all, your personal assistant will be able to connect with an entire world of other people’s lives and schedules. Tell your assistant to schedule a doctor’s visit and the assistant will contact the doctor’s office, determine when you’re available, schedule the appointment, and even tell you when to leave your home and stick out your tongue.

Instead of people being forced to track time so they can schedule their lives, their personal assistants will track time for them. In such a world people will more likely focus on living in the present than caring what the time of day might be. And time keeping itself may even be seen as antiquated. After all, your assistant will be there to worry about time, and the future, for you.

Now obviously such a world would disturb many people, including myself. But imagine how our hunter-gatherer ancestors would have felt in modern New York City or Beijing, where continually tracking the minutes and hours of each day determines whether you can keep a job and support yourself and your family. They'd have been appalled at such a life.

There might be some good in achieving a time-free future, at least for people like myself with a horrible sense of time. I often find the hours and minutes of each day slipping away from me. I’ll be focused on my writing or staring at a beautiful sunset and suddenly realize I forgot to meet my friends at the neighborhood bar.

But I also fear such a time-free world. Who knows how humans will change if we no longer have to actively track time on a personal level. Will we still plan for the future? Will people descend into an eternal hedonistic present where all that matters is moving from one scheduled moment of excitement to the next?

Even more troubling, perhaps employers will require their workers to use specially approved scheduling inputs. In this scary version of Ellison’s “Repent, Harlequin,” your employer will be assured that your life and happiness never interfere with work-related duties, because they’ll schedule your life so that never happens.

One of the things which make us human is our species’ use of tools. Our tools both define and shape who we are. And few tools are as subtly powerful as how we track and measure time.

None of these speculations mean humans will ever stop tracking and measuring time. Of course we won’t. Scientific and technological systems will always require accurate time measurement. But I suspect that on a personal level, humans will one day give up watching the clock for an even more controlling form of life scheduling.

So forget repenting for the Ticktockman. Because in such a future humans will be so intermeshed with life scheduling that time itself might appear to disappear.

Tick. Tock. Tick. Tock.

What happens to storytelling when the audience knows everything?

The opening of the classic science fiction conspiracy show The X-Files posed an evocative statement: “The Truth Is Out There.” Well, the truth is still out there and, thanks to the internet and mobile technology, humanity is well on its way to having continual access to any truth we want.

These days anyone with a mobile phone or tablet carries vastly more computing power than all the computers on Apollo 11, which landed on the moon in 1969. Add in online access to the ever-growing libraries and archives of human experience and it’s possible for many people to instantly learn the answer to nearly any question they have.

And we’re merely at the beginning of continual access to human information and knowledge. Wearable tech, with its promise to embed online access into clothing and eyeglasses and more, is already appearing. Body tech will follow shortly after.

We’re already seeing major changes in society from people having access to information through mobile devices. Paper maps and guides, which existed for thousands of years, are nearly extinct in some countries as people use their phones and GPS to navigate. Printed encyclopedias and dictionaries have also mostly disappeared, replaced by Wikipedia and other online resources. And social movements like the Arab Spring owed much of their power to the instantaneous sending of information between people by social media.

Those are merely the start of the changes we’ll see when every human has instant access to any information they desire. And one intriguing question I’ve been pondering is what this continual access to information will do to storytelling.

Here’s the issue: the vast majority of stories depend on an information gap among their characters. This gap between what different characters know and don’t know helps create a story’s drama.

For example, in Romeo and Juliet a main character commits suicide because he believes his lover is dead. But what happens to that story when the characters can instantly find out they’re both alive?

Or what about Liam Neeson’s film Taken, where a father hunts for the people who kidnapped his daughter? What happens to that story when the father can instantly know the address where his daughter is being kept? Or his daughter can access an online database to learn of her kidnapper’s true nature when she first meets him?

And what about the eternal horror story where a group of kids visit an isolated house containing a monster or killer? What does that story turn into when the characters not only know the monster is present but download the monster's profile and stay in continual touch with each other instead of splitting up to be killed?

Those are only three examples of stories where instant access to information could kill much of the story’s drama and plot. And this doesn’t take into account certain literary genres like mysteries, which could face extinction if audiences come to expect any crime to be solved with a few seconds of research on a smartphone.

All of this may sound like nitpicking, but it’s an issue today’s authors and story creators must address. Otherwise their stories will no longer be believable. Audiences accept and go along with stories because of suspension of disbelief: people accept a story’s fantastical aspects as long as the story retains a semblance of truth. And in a world where many audiences already have instant access to all the information they desire, not giving characters the same access will cause those audiences to doubt other aspects of the story.

Authors are already dealing with this issue in their fiction. I’ve talked to many writers over the last decade who bemoan what cell phones have forced them to do with their stories. If a character in a story gets lost or needs to call for help, an author has to set up exactly why that character doesn’t pull out their phone and use Google Maps or contact the police.

Ever wonder why so many films and books these days are set in areas with bad cell phone coverage? Mystery solved.

Much of human history revolved around a scarcity of human knowledge. Our early hunter-gatherer ancestors lived in a world of danger and strangeness, governed by scientific rules they didn’t understand. Even when civilizations began inventing written languages and record keeping, most people didn’t have access to that information. Instead, storytelling and oral traditions and religion and gossip and rumor were essential for the sharing of needed knowledge.

But things have changed significantly over the last few centuries as schooling and literacy expanded, and as the printing press and then radio and TV gave more and more people access to needed information. One way to look at human history is that greater and greater percentages of humanity are continually being given the tools to access the knowledge and information they desire.

And now we’re on the cusp of every human having access to all information all the time.

Despite all this, ready access to information and knowledge is unlikely to totally destroy stories because the biggest source of human drama in storytelling won’t disappear. For many years to come you still won’t know what the person next to you is thinking or planning to do. The individual worlds and thoughts which swirl in each and every human will still be mysterious realms. Great stories will continue to explore the drama and conflict arising from that.

And just because humans will have 24/7 access to all information, that doesn’t mean they’ll desire the correct information. Or understand the information. Or act on it properly. All of which provides even more fuel for great stories.

In many ways science fiction and fantasy stories are able to deal with this issue far better than other storytelling genres. After all, if you set your story in an epic fantasy world without cell phones, or in a future universe or time where information is severely controlled, audiences won’t expect your characters to have continual access to needed information. Basically, the SF/F genre has long used world building to address why characters can’t use magic or technology to access any information they might need.

I wish I knew exactly how our information revolution will eventually change stories. But that’s exactly the information I’m unable to access at this time.

The Submissions Men Don't See

Lots of screaming in the SF/F genre lately about "data" suggesting far more women are being published in genre magazines than men. The problem with that analysis, though, was that it only looked at a small group of magazines. Add in all the other professional SF magazines out there and the numbers change, making the controversy choke on a big mouthful of nothing pie.

Don't believe me? Check out this excellent examination of gender submission and publication statistics in the SF/F short fiction field, which Susan E. Connolly published in Clarkesworld in 2014. Her examination spanned multiple articles and is incredibly detailed with a strong data set. Her conclusion? "Authors who are women are less well represented in terms of submissions and publications than authors who are men." 

However, there was an interesting item to come out of the recent controversy. In an interview with Neil Clarke about Clarkesworld's submissions, Jon Del Arroz quoted the editor as saying "men were also slightly more likely to submit multiple stories per month." After talking with Neil about the magazine's overall submission and publication track record, Jon added:

"What I take from this, despite his not analyzing the breakdown of why stories fail through submissions by gender, is that men are more prone to submit stories which don’t fit with Clarkesworld’s style of science fiction, and submit them anyway just hoping they make it in a crap shoot."

There's truth in that for the SF/F genre: men are indeed much more likely to take a "crap shoot" attitude toward submissions than women, with male writers far more likely than female writers to skip reading submission guidelines and to submit inappropriate stories which don't fit what a magazine publishes.

One reason I reacted so strongly to Jon's words is they match what I saw when I edited storySouth. Men would spray submissions at my magazine as if marking their territory, assuming their brilliance would overcome not reading our magazine or following our guidelines. Yes, a few female authors also did this, but the numbers were really skewed toward men.

While I haven't edited storySouth in many years, it appears this trend is still going strong. For example, a professional genre magazine whose editor I know shared their September submission stats for this essay. The stats didn't include author names or any submission information aside from demographics and whether the editor felt each submission was appropriate for their magazine and/or followed the guidelines.

So far this month the magazine has received 182 submissions, 54 of them by female authors (matching the analysis linked above, which found that far more men than women submit genre stories). Of these 182 submissions, the editor felt 11 were totally inappropriate. Of those 11 submissions, ten came from men and only one from a woman.

By inappropriate submissions, the editor is very specific and means a story which by no stretch of the imagination would fit with what they publish, meaning the author didn't read their magazine or their guidelines. Some of these submissions also didn't follow standard manuscript format, although the editor said their magazine is generous on SMF and they only get irritated when authors deviate massively from it.

This editor also added that those 182 submissions included five male authors who submitted a total of 11 stories. Only one woman submitted more than a single story in the same time frame.

Again, this matches what I saw years ago with storySouth. And while stats about this are difficult to find, editors I've spoken to over the years have told me a similar pattern exists in many publications, even outside the SF/F genre.

When I consider why this happens, I keep coming back to the famous story "The Women Men Don't See" by James Tiptree, Jr., where the male narrator can't see women as real people and so misses the truth of what's going on around him. As Tiptree said of her story, its message is the "total misunderstanding of women's motivations by narrator, who relates everything to self."

In the case of why male authors are far more likely to not read a magazine or their guidelines before submitting, and are more likely to submit multiple stories in a short time frame, I think it ties in with them not seeing the motivations of others and believing all that matters is what they want. 

But if you're submitting your stories to an editor, what you want isn't what lands the acceptance. It's what the editor wants. Otherwise, an author is merely wasting everyone's time.

Autonomous by Annalee Newitz is SF punk for a new generation

Ever since the publication of William Gibson's 1984 novel Neuromancer, which helped jumpstart the cyberpunk subgenre, there's been a tendency to "punk" each new exciting science fiction trend or book. Biopunk, steampunk, nanopunk, bugpunk — the punk designation is tied as much to the attitudes represented by these subgenres as to the stories' subject matter and new takes on traditional SF themes.

One of the best debut novels I've read this year is Autonomous by Annalee Newitz. The story, which focuses on a future in which biotech and artificial intelligence and corporate control of patents rule all our lives, is begging for an SF punk label. There are more exciting ideas and possibilities in Newitz's novel than in an entire year's worth of works released by more traditional SF publishers. The story is also fast paced with interesting characters ranging from the traditionally human to AIs enslaved in war-fighting bodies.

William Gibson called the novel "genuinely and thrillingly new," which is extremely accurate. The world created in Autonomous is so interesting and unique that I could see this novel inspiring a new subgenre. Maybe AI-punk, or a reworking of what biopunk is currently about. Either way, Autonomous is an excellent new science fiction novel which fans of the genre's "literature of ideas" will love and will be on my short lists for next year's award nominations. Check it out.

Measuring the slow Hugo Award death of the rabid puppies

Congrats to this year's Hugo Award winners. Plenty of great works won Hugos at the 75th Worldcon in Helsinki, including a Best Novel for The Obelisk Gate by N. K. Jemisin. This means the first two novels in Jemisin's Broken Earth trilogy have won the Hugo Award. I'm really looking forward to reading the final book in the trilogy, The Stone Sky, which comes out in a few days. The Broken Earth trilogy is now one of the most acclaimed and honored series in SF/F history, so if you haven't read the novels do so.

It was also exciting to note the crossover between this year's Hugo and Nebula Awards, with the novella "Every Heart a Doorway" by Seanan McGuire and the short story "Seasons of Glass and Iron" by Amal El-Mohtar winning both awards.

One of the most interesting aspects of this year's Hugos was to see how the new voting rules revealed the overall weakness of the rabid puppies slate. Under this year's Hugo rules, designed to reduce the impact of bloc and slate voting, people were able to nominate up to 5 works or people in each category. However, the top 6 works or people in each category become finalists, ensuring slate voting can't stuff all slots on the final Hugo ballot.

In addition, nomination votes were tallied by both the total number of nominations received and by points, with a single point assigned to each individual voter’s nomination ballot. That means if you nominated works in all 5 slots within a category, each of those nominations would receive 1/5 of a point. If you nominated only a single work in a category, that nomination would receive a full point.
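The ballot-and-points arithmetic described above can be sketched in a few lines of Python. This is a simplified illustration with made-up ballots, not the actual tallying software used by Worldcon:

```python
from collections import defaultdict

def tally(ballots):
    """Tally one category. Each ballot lists a voter's nominees
    (up to 5). A nominee gains one ballot count per ballot it
    appears on, and 1/len(ballot) points from each such ballot."""
    counts = defaultdict(int)
    points = defaultdict(float)
    for ballot in ballots:
        for nominee in ballot:
            counts[nominee] += 1
            points[nominee] += 1 / len(ballot)
    return counts, points

# Hypothetical ballots: two voters spread their nominations,
# one voter bullet-votes a single work.
ballots = [["A", "B", "C"], ["A", "B"], ["D"]]
counts, points = tally(ballots)
# "A" appears on two ballots but earns only 1/3 + 1/2 of a point,
# while bullet-voted "D" keeps a full point from its one ballot.
```

This is exactly why a nominee whose points nearly equal its ballots, as in the tallies discussed below, signals bullet voting: every fraction short of a full point means that voter nominated something else in the category too.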

Because of these new rules Vox Day, who organizes the pup slate each year, urged his followers to make only a single nomination in most categories, ensuring their slate picks would receive the maximum number of points. While this strategy placed a single slate work among the finalists in many categories, it also revealed exactly how small their movement is.

For example, in the nomination tally released last week by Worldcon (PDF download), eventual Best Novel winner The Obelisk Gate received 480 ballots with a final points tally of 295.97 (out of 2078 total ballots cast for 652 nominees). Most nominees on the nomination tally received similar ballot-to-point spreads, indicating the people nominating those works were also nominating 2 or 3 or more works in each category. Since their points were spread across multiple works in each category, the points for most nominees were far lower than the number of ballots those nominees received.

Not so with the pup slate. For example, in the Best Novelette category the pups' joke nomination “Alien Stripper Boned From Behind By the T-Rex” received 77 ballots and 76.50 final points, meaning almost every person who nominated that "story" didn't nominate anything else in that category.

In the Best Short Story category, "An Unimaginable Light" by John C. Wright received 87 ballots and 87 final points out of 1275 ballots cast, suggesting no one outside of the pup slate nominated his story. In the Best Editor (Long Form) category, Vox Day received 83 ballots with 83 final points out of 752 total ballots cast. As with Wright, this suggests no one outside of VD's slate nominated him.

These numbers back up previous estimates of the weakness of the rabid puppies and give more evidence that at most 80 to 90 Hugo voters support Vox Day's ballot stuffing. These are extremely small numbers compared to the more than 2,000 people who cast nominating ballots this year, or the 3,319 people who voted during the final Hugo ballot.

The reason the rabid puppies were able to cause so much trouble with the Hugo Awards in recent years is because the awards were easily gamed by a small group of slate voters. Only cultural constraints within fandom prevented this from happening before the rabid puppies came along.

The results of this year's Hugo voting show that making an award resistant to slate voting is a must in today's genre.

Perhaps the Dragon Awards, a new SF/F award now being ravaged by slate voting from the pups, will learn from the Hugo experience. Or perhaps not.