Wednesday, 11 July 2018

Is the password system broken?

For our latest look at the topic of digital citizenship, Susan Halfpenny must use at least one lower case character, upper case character, number and special character.

Padlocks on a rail

Large data breaches in recent years have led to millions of accounts being hacked and personal information being shared (take a look at World’s Biggest Data Breaches for a visual representation): the Yahoo! hack in 2013 resulted in more than one billion user account credentials being stolen.

Often, the theft of username and password information can lead to more than just one of your online accounts being compromised. Matt Honan has written at length about his experience of being “epically hacked”, where in the space of an hour his Google account was deleted, his Twitter account taken over, and his AppleID account broken into, resulting in data being wiped from his iPhone, iPad and MacBook.

Hackers will often exploit weaknesses in security systems to access information. For example, in the iCloud leak of celebrity photos in 2014, hackers may have taken advantage of a flaw in the application interface which permitted unlimited attempts to guess passwords. Could companies do more, then, to protect our information?
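By way of illustration, here is a minimal sketch (in Python, and entirely hypothetical rather than any real provider’s code) of the kind of server-side rate limiting that blocks unlimited guessing: after a handful of failed logins, further attempts on that account are refused for a while.

```python
import time
from collections import defaultdict

# Hypothetical limits: five failed attempts locks the account for five minutes.
MAX_ATTEMPTS = 5
LOCKOUT_SECONDS = 300

failed_logins = defaultdict(list)  # username -> timestamps of recent failures


def may_attempt(username: str) -> bool:
    """Allow a login attempt only if recent failures are under the limit."""
    now = time.time()
    # Forget failures older than the lockout window.
    failed_logins[username] = [t for t in failed_logins[username]
                               if now - t < LOCKOUT_SECONDS]
    return len(failed_logins[username]) < MAX_ATTEMPTS


def record_failure(username: str) -> None:
    """Call this whenever a password check fails."""
    failed_logins[username].append(time.time())
```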

Encryption and adding layers of security to applications can obviously help, but the major flaws undermining everything else are the limitations of human memory, our collective lack of understanding of what makes a password secure, and our lack of patience. More often than not, we give our information away ourselves, through phishing emails and poor personal information security, like using the same weak password for every account. We might try to do better, but, in our modern, busy lives, who among us can remember a hundred and one different and adequately complex passwords?

Even those of us who should or do have a high level of awareness and understanding of information security will still fall prey to laziness. I’m currently trying to use two-step authentication to keep my accounts more secure, but I hate it when I go to deliver a workshop and realise I’ve forgotten to pick up my phone from my desk, meaning I have to head back to the office to collect it so that I can receive the text message containing the one-use code I need to access my account. It’s at times like this that there’s a very compelling temptation to switch two-step authentication off!
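Incidentally, text messages aren’t the only way to deliver that second factor: authenticator apps generate time-based one-time passwords (TOTP, RFC 6238) entirely offline, so a forgotten phone is still a problem but network coverage isn’t. As a rough sketch of how those six-digit codes are derived (the secret below is a placeholder, not a real key):

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive the current time-based one-time code (per RFC 6238)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval           # 30-second time step
    message = struct.pack(">Q", counter)             # 8-byte big-endian counter
    digest = hmac.new(key, message, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


print(totp("JBSWY3DPEHPK3PXP"))  # placeholder secret for demonstration only
```

The server holds the same shared secret and performs the same calculation, so the codes match without any message needing to be sent.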

The current password system relies too much on our memory and our patience, and on everyday people who aren’t trained to think about information security all day. We might therefore say that the current password system is broken.

So how are hackers exploiting security flaws and human errors?

You may be surprised to hear that hackers aren’t necessarily using complicated code to hack into accounts. Yes, sometimes large-scale attacks take place using programs that target security flaws, but often passwords can simply be guessed through social engineering: using the information you share online. For some stark examples, take a look at this article by Kevin Roose, where he enlists expert hackers to hack him in order to highlight security risks.

Norton collated some useful information about the different ways that hackers hack into your passwords, summarised below:

  • Social engineering: the use of information lifted from your social media to gather answers to your security questions… things like the school you went to, your pet’s name, when you got married, when it’s your birthday, your favourite band… Hackers can gain access to all this information and use it to answer your security questions and guess your passwords.

  • Dictionary attacks: using programs that cycle through a predetermined list of common words often used in passwords. If you are using Password1 as the password for your account then what did you think was going to happen?! To better protect your accounts from dictionary attacks, avoid using common words and phrases in your passwords, or avoid recognisable words altogether.

  • Password crackers: programs used to crack passwords by brute force, repeatedly trying millions of combinations of characters until your password is found. Shorter, less complex passwords are quicker for a program to guess; longer, more complicated passwords take exponentially longer, so the longer and weirder the better (see the sketch after this list)!
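To get a feel for why length and complexity matter so much, here is a rough back-of-the-envelope calculation in Python. The guess rate is an assumption on our part (offline attacks against weakly hashed password databases can plausibly manage billions of guesses per second on specialist hardware); the point is the exponential growth, not the precise figures.

```python
# Worst-case time to exhaust every possible password of a given length.
# GUESSES_PER_SECOND is an assumed figure for a well-resourced offline attack.
GUESSES_PER_SECOND = 1e10


def seconds_to_exhaust(alphabet_size: int, length: int) -> float:
    """Time to try every combination of `length` characters."""
    return alphabet_size ** length / GUESSES_PER_SECOND


for length in (6, 8, 10, 12, 16):
    lower_only = seconds_to_exhaust(26, length)   # a-z only
    full_set = seconds_to_exhaust(94, length)     # all printable ASCII
    print(f"{length:2d} characters: lowercase-only {lower_only:,.0f}s, "
          f"full character set {full_set:,.0f}s")
```

Six lower-case letters fall in a fraction of a second; sixteen characters drawn from the full set would take this hypothetical attacker trillions of years.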

But if we’re creating lengthy and complex passwords, how can we hope to remember them? Mnemonics can only get us so far. We could use some form of encrypted password management software, though vulnerabilities apply there too: a hacker who guesses your master password gains access to all of your passwords! Even so, this should be more secure than using the same password(s) for everything: when you reuse a password, a breach of any one account exposes all the others, whereas a password manager concentrates the risk in a single, strong, well-guarded master password. Whatever method you choose, a set of complicated but securely stored passwords should be far more secure than a handful of easily memorable ones, if only because they’ll be considerably less guessable.
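And if software is going to remember your passwords for you, they may as well be properly random. Here is a minimal sketch of the kind of generator a password manager typically offers, using Python’s cryptographically strong `secrets` module:

```python
import secrets
import string

# Letters, digits and punctuation: 94 printable characters in total.
ALPHABET = string.ascii_letters + string.digits + string.punctuation


def generate_password(length: int = 20) -> str:
    """Return a random password; `secrets` is designed for security use,
    unlike the general-purpose `random` module."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))


print(generate_password())  # a different 20-character password every run
```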

For more help and advice, take a look at the IT Services tips for choosing a strong password, and test yourself in our information security myths quiz.

Wednesday, 27 June 2018

On the internet, nobody knows you're a dog (unless you tell them)

Our series of explorations into what it means to be a digital citizen continues with Stephanie Jesper pretending to be a dog...

A cat hiding behind a pole

As Peter Steiner’s 1993 cartoon for the New Yorker put it: “On the Internet, nobody knows you’re a dog.”

The internet only knows what you tell it. And what you might want to reveal may vary according to what it is that you want to do. There is a long tradition on internet forums and bulletin boards of using a pseudonymous screen name or handle. In large part this was a mechanism to permit discussion of ‘sensitive’ subjects: an alias is a very simple way of distancing your online profile from your offline one, be it for social, professional, or even legal reasons. But choosing an amusing or clever name can also serve as a fun means of expressing a persona.

What is more, pseudonymous screen names can facilitate objectivity in a discussion: social factors such as gender, age, location, education and race may be obscured (partially or entirely), reducing the impact of preconceived biases. A screen name can also allow a user to experiment with or hone their identity (for example in the trans community), and may give confidence to those who might, under their real name, feel socially awkward for whatever reason.

This confidence boost can be double-edged, of course: hiding behind a screen name may give you the courage to express yourself and your opinions, and to explore areas of society and culture that you may otherwise have been too afraid to examine (be it a question of taboo, reputational risk, a fear of failure, or some other impediment), but it can also give you the courage to test the limits of your powers, to be abusive, and to threaten other users without fear of recourse. At its most pathetic, this is manifest in Wikipedia vandalism and childishly disruptive behaviour in internet forums; at the other extreme lies persistent trolling, bullying, and even death threats.

By using an online pseudonym, we make it intentionally difficult for people to connect our online activities to our real-world persona, which is fine unless we actually want that association. We may be looking to promote ourselves, and to connect with people we know, used to know, or want to know in real life, in which case a pseudonym is probably going to get in the way. This is why Facebook and LinkedIn operate real name policies: they’re geared around people finding other people. The problem with being findable, however, is that you can’t especially control who can find you. Having a potential employer find your LinkedIn profile might be a positive thing (assuming it’s an attractive profile); having them find your Facebook profile might be less positive, depending on what you’ve got on there and how locked down it is.

There’s a tradeoff to be had between self-promotion and freedom of expression, and many approaches to take. You could lead a completely uncontroversial life, online and off, and have the tenacity and resilience to cope with any unwanted intrusion. You could live entirely under the cloak of anonymity, but then you may find that you’ve relinquished control of the top search results for your real name, which may not necessarily be a favourable state of affairs. A middle way is to conduct your social activity under one name and your professional activity under another: some people, especially on Twitter, make use of two accounts – one professional and one social – and Twitter’s own mobile apps support switching between multiple accounts. But in many professions the social use may actually prove a professional advantage, and separating the two can be a difficult, and perhaps false, distinction to draw.

The information trail we leave online isn’t just a reputational concern. We can give away a lot of personal details, and while for the most part this will be just noise on the internet, it is information that can be used against us.

The TV series Hunted provides an effective (and indeed entertaining) illustration of how our online activity can betray our movements, our intentions and our personal networks. In some cases, confiscated devices, phishing attempts and hacked passwords are used as a means of gaining sensitive information, but all too regularly the clues hide in plain sight: on open social media accounts that any of us can see.

If you’re posting in an open forum, anybody can access that information. Tweeting something like…

Holiday! Just hope my new bike can bear 2 weeks without me, languishing in the backyard of 12A The Grove, Chepstow. Forgot to chain it. Oops

…is obviously a bad idea. But communicating even snippets of such information has risks (as we explore on our Subject Guides) because snippets can build up into a larger picture about you and your circumstances.

It isn’t just what we post that poses a potential risk. Our accounts themselves may be sharing more than we might think, as the Cambridge Analytica scandal has demonstrated. If you’ve ever seen your Facebook profile picture staring back at you from the comments section of a blog post, inviting you to participate, or if you’ve seen adverts targeting your interests, you’ll have an idea of the kind of thing that can get passed around. It’s a good idea to go through your social media security settings with a fine-toothed comb every now and again, to lock down as much as you’re able, but inevitably there is a tradeoff between security and functionality. As with so much, it’s a case of striking a balance and being aware of the risks involved.

Wednesday, 13 June 2018

The need to know

In our second of a series of explorations on what it means to be a digital citizen, Stephanie Jesper and Alison Kaye assert their inalienable right to WiFi.

Maslow's hierarchy of needs, underpinned by the need for WiFi
Maslow's Hierarchy of Needs (revised)

For many of us, internet access has become ubiquitous. As the meme above illustrates, over the course of a single generation we have become profoundly reliant upon our connection to the net.

It’s hard to imagine how those of us who were alive in the early 1990s managed to cope without the world’s knowledge at our fingertips everywhere we went. Arguments over matters of trivia would last for days until Wikipedia became but a few thumb-swipes away. If you’ve ever been to a conference with inadequate WiFi, or taken a holiday in the middle of nowhere, with no network access, you’ve a flavour of what it must be like to live in information poverty.

We're being flippant, of course, but with so much of modern life being online, including job applications and government paperwork, those of us who are not online are at quite a considerable disadvantage. Over half of the world’s population (about 52% as of the end of 2017) do not have an internet connection. Even in the UK, the figure is about 7% — c.5m people (that's more people than watch Gogglebox). These people lack what for many of us has become a basic necessity.

This is why countries such as Costa Rica, Finland, France and Greece have enshrined some form of internet access rights in law, and why in 2011 the UN Special Rapporteur recommended that:

Given that the Internet has become an indispensable tool for realizing a range of human rights, combating inequality, and accelerating development and human progress, ensuring universal access to the Internet should be a priority for all States.

A snail bridges a gap
Bridging the skills gap

The skills gap

But even those of us who can get online still need the skills required to engage effectively in the modern workforce and our digital society. In 2015 the UK Select Committee on Digital Skills, appointed by the House of Lords “to consider and report on information and communications technology, competitiveness and skills in the United Kingdom”, raised alarm bells in their Make or break report. They referred to work by the UK Forum for Computing Education (UKForCE) on the skills required for different occupations. UKForCE outline four categories of digital skill required across the labour market:

Digital muggle

“… no digital skills required—digital technology may as well be magic”.

Digital citizen

“… the ability to use digital technology purposefully and confidently to communicate, find information and purchase goods/services”.

Digital worker

“… at the higher end, the ability to evaluate, configure and use complex digital systems. Elementary programming skills such as scripting are often required for these tasks”.

Digital maker

“… skills sufficient to build digital technology (typically software development)”.

They used this framework to analyse the 361 Standard Occupation Codes, a common classification system used to map all occupations in the UK according to their skill level and skill content, to show the following:

Percentage of the UK workforce in each category:

  • Digital muggle: 2.2m (7%)

  • Digital citizen: 10.8m (37%)

  • Digital worker: 13.6m (46%)

  • Digital maker: 2.9m (10%)

According to these figures, 93% of UK jobs require at least some digital skills — skills that 12 million of us in the UK lack. And with automation estimated to threaten 35% of UK jobs, the need for digital skills becomes all the greater.

Libraries can have a role in bridging this skills gap, offering access to digital technologies, fostering the literacies required to navigate the world of digital information, and thereby enabling digital citizenship and participation in digital society (Explore York, for instance, have drop-ins for tablets and e-readers, one-to-one sessions on computer basics, and an introductory course on using the internet).

And here at the University of York there've been a number of projects to develop digital skills for both students and staff. Library & IT staff have been working with departments to incorporate digital literacy across all courses, and we’ve also put together a new programme of digital skills training sessions. Alongside all of this there's our online Skills Guides which are open for anybody to access and use, and we're currently working on an IT Essentials site to help people escape their muggledom and exercise some digital wizardry.

Monday, 11 June 2018

A walk in the Park? Jurassic Park at 25 and the enduring popularity of dinosaurs.

The 11th June 2018 marks the 25th anniversary of the release of the movie Jurassic Park. Based on the 1990 novel by Michael Crichton, Jurassic Park is a blockbuster which has stood the test of time and spawned an ongoing franchise of movies and games. The film won three Academy Awards and two BAFTA awards, and key scenes remain staples in lists of favourite film moments. Released in 1993, Jurassic Park set a new standard in animatronics, bringing dinosaurs to life as never before and sparking renewed popular interest in the prehistoric.
But although Jurassic Park (and its subsequent sequels) may have been the most successful appearance of dinosaurs in cinema, this certainly wasn’t the first time dinosaurs had captured the popular imagination, nor even their first appearance on the big screen. The growth of the discipline of geology in the early nineteenth century brought new understanding of the age of the earth and the existence of now-extinct species, in turn developing the discipline of palaeontology. The identification of different geological layers helped date the discoveries of pioneers like Mary Anning, who found numerous important Jurassic specimens along the Lyme Regis coastline from 1811 onwards. Geological societies helped establish collections which would in turn form the basis of museum departments.

Still on display, the Crystal Palace dinosaurs
are now a Grade 1 listed landmark. (Image from Pixabay)
Popular interest was fuelled by increasing public access to museums and by projects like the Crystal Palace dinosaurs, unveiled to the public in 1854: the first life-size replicas of extinct animals. The artist Benjamin Waterhouse Hawkins produced vast sculptures representing these prehistoric creatures. Although inaccurate by modern understanding, this was a serious attempt to represent the creatures properly, with advice taken from the eminent palaeontologist Sir Richard Owen. More fanciful were representations in the emerging literary genre of science fiction, perhaps the most famous example being Sir Arthur Conan Doyle’s novel The Lost World, first published as a serial in 1912.
The Lost World establishes an archetype of dinosaur fiction - plots centred on remote places where dinosaurs and other extinct creatures have miraculously survived. Conan Doyle creates an isolated jungle plateau, but hidden valleys and forgotten islands are often used to the same effect. The alternative thread of dinosaur fiction emerging through the twentieth century, of which Jurassic Park is part, saw not the discovery of hidden dinosaurs but their scientific resurrection or recreation. As understanding of dinosaurs and of the possibilities of genetic engineering grew, so did ideas about somehow recreating extinct creatures. Dinosaurs could play a part in stories about the limits and consequences of unchecked scientific experimentation.

This is not to say that all late twentieth-century approaches to the topic were so thoughtful. Whilst Jurassic Park offered bioethical questions in the midst of an action blockbuster, other 1990s film efforts were not so successful. Released only a year after Jurassic Park, in 1994, Dinosaur Island was made on a distinctly lower budget, feels far older, and was widely panned. The film uses a variant on The Lost World model - US airmen crash on a remote island inhabited solely by scantily clad women and the prehistoric creatures who menace them. The plot, film quality and attempts at special effects have all aged badly, with the film arguably little more than an excuse for extensive on-screen female nudity.

But neither are plots based on the scientific resurrection of dinosaurs guaranteed success. Even allowing that any scenario involving the recreation of dinosaurs would be far-fetched, the 1993 film Carnosaur tries to raise issues about scientific development and future dystopia which are largely lost in a mind-bogglingly improbable storyline, in which a lethal airborne virus impregnates women with genetically mutated dinosaurs. It was not critically well received, and nor were the two sequels that followed: the second has the genetically altered dinosaurs as an experimental weapon, while the third went straight to video.

Much of the success of Jurassic Park is due to the apparent realism of the dinosaurs. Although many earlier dinosaur films took little interest in the accurate portrayal of their extinct subjects, the twentieth century saw increasing efforts to depict prehistoric creatures with greater scientific precision, even in fiction. One of the earlier examples of this more factually based dinosaur entertainment can be found in Disney’s 1940 feature-length film Fantasia. The animated section accompanying The Rite of Spring shows prehistoric life, culminating in epic sequences depicting the dinosaurs and their demise. Although it was an animation created for popular entertainment, efforts were made to maintain some level of scientific accuracy, with the studio taking expert advice from authorities including the director of the American Museum of Natural History, the biologist Julian Huxley and the palaeontologist Barnum Brown.

Of these animated depictions, Fantasia probably gives the least anthropomorphised view of dinosaurs. Jurassic Park was not Spielberg’s first foray into dinosaurs on film - he was the producer of the animated 1988 film The Land Before Time. Reminiscent in animation style of the Fantasia sequence, this is another story set in the age of dinosaurs, without any human interference. Although the basic dinosaur behaviour is broadly accurate, the dinosaurs are named and anthropomorphised - given voices, an inter-species gang of young dinosaurs faces peril and adventure as the plot unfolds. The project had originally intended to portray the dinosaurs more “naturally”, without dialogue, but the decision was taken to create the protagonists as characters with voices. (Whilst a very different approach to dinosaurs on film, The Land Before Time, like Jurassic Park, proved popular and also gave rise to numerous sequels.)

Not technically dinosaurs, but Jurassic creatures:
CGI enabled the portrayal of pterosaurs on film. (Image from Pixabay)

The incredible developments in computer-generated imagery (CGI) in recent years have changed the way dinosaurs are brought to the screen. Without total reliance on models and animatronics, new creative possibilities open up, such as the depiction of flying creatures like pterosaurs. Blending animatronics and CGI - as was done in Jurassic Park and various subsequent productions - allows close interaction between human and dinosaur actors, as well as giving the potential to use different environments and to create whole herds of dinosaurs, the sheer numbers of which would have been both vastly expensive and impractical had a production relied solely on models. The innovative blend of live action, models and CGI helped win the film multiple awards for special effects.

Unusually, fact followed fiction in taking this blended approach. The 1999 BBC series Walking with Dinosaurs presented prehistoric creatures in the format of a wildlife documentary, following individual animals as though they were living subjects. The series also used a combination of animatronics and CGI, with real locations providing natural backdrops to the action. Although a factual series, it was inspired by the public interest created by Jurassic Park and used many of the same techniques to portray dinosaurs on screen.

A turkey-sized terror (Image from Pixabay)
So how realistic are the dinosaurs in Jurassic Park? In any scientific field, new knowledge comes to light and changes past understanding, so portrayals will change. Palaeontologists could take umbrage at the film’s title: although the brachiosaurus is a Jurassic dinosaur, many of those featured in the film are not. The triceratops is from the Cretaceous period, as is tyrannosaurus rex. Velociraptors are also Cretaceous, and although it is widely accepted that they hunted in packs, in reality these carnivorous dinosaurs were considerably smaller than in the films. Instead of standing over 7 feet tall, as in the movie, these predators were actually around the size of a large domestic turkey. It is also thought that they may have been feathered: although the similarities between birds and dinosaurs are mentioned in the film, most notably with the ostrich-like gallimimus, feathers are not introduced. It is easy to see why the filmmakers were tempted to add the iconic triceratops and tyrannosaurus rex to the dinosaurs featured in the park, but the Jurassic period also had well-known species: pterosaurs were common, allosaurus and megalosaurus were major predators, and stegosaurus was perhaps the most recognisable dinosaur of the period (these species do start to make appearances in later films from the franchise).

A detail of Dippy, a long-time favourite at the
Natural History Museum of London, now on tour
around the UK. (Image by dronepicr, via Wikimedia Commons)
Twenty-five years on, Jurassic Park has aged well and still stands up as an action film. The ongoing series of sequels suggests a continuing appetite for prehistoric creatures on film, whilst dinosaurs remain a perennial favourite in any natural history museum. The crowd-pleasing quality of dinosaurs is attested by the Natural History Museum of London’s decision to host a mini-site, The Dino Directory, and by the popularity of events and exhibitions such as Dippy the Dinosaur, the star of a national touring exhibition, and Yorkshire’s Jurassic World, recently opened at the Yorkshire Museum. So what is the enduring appeal? Perhaps it is that dinosaur fiction is both scary and safe. These powerful creatures, about which we know relatively little, allow for a thrill of fear - they could have crushed or eaten you - but the reality of their extinction renders them harmless: they pose no threat to the modern audience.

The dinosaur perhaps also reminds us of the changeability of our planet and of human transience upon it. With increasing awareness of climate change and loss of biodiversity, a focus on extinct species can add poignancy. Narratives using the “Lost World” device of a hidden prehistoric habitat can give rise to stories questioning human impact on the natural world and habitat destruction, whilst plots resurrecting the extinct through science allow for the exploration of ethical questions about genetic science and the human exploitation of other species. In the original Jurassic Park film, the emphasis on chaos theory and the violent failure of the park offers a message that commercialised science is exploitative and potentially dangerous. And there lies the irony. As Megan Stern notes in her article Jurassic Park and the moveable feast of science, Michael Crichton and Steven Spielberg succeed where the film’s John Hammond fails. Although the narrative depicts the commercial resurrection of dinosaurs as a disaster, it is a feat which Jurassic Park itself accomplished with great success.

Find Megan Stern’s article Jurassic Park and the moveable feast of science, in the journal Science as Culture, Volume 13, issue 3, 2004. University member? Log into Yorsearch and get access to Science as Culture online.

Inspired to re-watch the movie? Find the DVD of Jurassic Park at shelf-mark LP 4.30973 SPI in our audio-visual collection.


Read Conan Doyle's classic adventure The Lost World, available in the J B Morrell Library at shelf-mark MA 173.9 DOY.

Yorkshire’s Jurassic World is curated by York Museums Trust and is open at the Yorkshire Museum.


Wednesday, 30 May 2018

The connected world

In our first of a series of explorations on what it means to be a digital citizen, Stephanie Jesper wires up this blog to her web-enabled toaster and sees what pops out.

Paperclips

Ask a futurologist what our digital future holds for us, and they’ll come up with some pretty mad stuff…

There’s “augmented reality” (AR), for instance: where our view of the world is digitally enhanced in real time. We may scoff at people going around in head-up display smartglasses like the short-lived Google Glass, yet the craze of the summer of 2016 was the smartphone AR game Pokémon Go. AR technology is still largely in its infancy, but it is already finding speculative applications in a range of areas, including medicine and industry, and it’s been used to particular effect in the museums sector. Whether this technology will lead us into a utopia or a dystopia is unclear. Maybe this video project by Keiichi Matsuda will help you to judge!

Another thing that futurologists love is the idea of the “internet of things”: a myriad of connected devices beyond what we usually think of as networked technology. For years we’ve been told how smart fridges will reorder our food as we eat it, and the usual response is something along the lines of “well that’s fine if I want to keep eating exactly the same things week in week out”. But more and more of us are hooking up our televisions to the router and watching Netflix and Amazon Video through them. We might even change the channels using a remote control app we’ve downloaded to our phones. We’ve got wearable technology like Fitbits on our wrists to keep track of our health and fitness. We might have an app to remotely control our central heating, or even turn the oven on while we’re commuting home. We may have set up a baby monitor linked to a tablet, or have a doorbell we can answer from work. And for those of us who are sceptical of the refrigerator ordering our shopping, there are Amazon Dash buttons we can press when we do need to stock up. All these linked technologies are steadily creeping into our homes, hooking onto our WiFi, and hopefully making our lives that bit easier.

This is not without risk: not that our internet-connected food processor is likely to gain sentience and go rogue… But all these devices have computers inside them, and computers are susceptible to hacking and malware. In October 2016, a distributed denial of service (DDoS) attack described at the time as the largest on record brought down a range of sites including CNN, Netflix, the Guardian, Reddit and Twitter. The attack was facilitated by Mirai malware that had found its way onto a range of devices, including Internet of Things appliances (in particular, digital video recorders and web-enabled cameras). Manufacturers urged owners to change the passwords on such devices from their factory-set defaults in a bid to improve their security, but this is not always straightforward (and in some cases may even be impossible). Until new security methods can be devised, our devices remain vulnerable to further attacks of this nature. With something like a webcam, the thought of being hacked becomes especially alarming, even if the chances of someone actually being interested enough to hack into your camera and watch you are probably pretty slim. What’s more, the United States director of national intelligence, James Clapper, stated in 2016 that:

In the future, intelligence services might use the [internet of things] for identification, surveillance, monitoring, location tracking, and targeting for recruitment, or to gain access to networks or user credentials…

Technology companies may already even be doing that. With internet-connected devices, it’s not just Alexa who’s listening in. Something like your television can be gathering your viewing habits or listening to your conversations, picking up lots of juicy (and potentially saleable) information about you. It’s easy to become paranoid when faced with things like this. The important thing, though, is to stay informed and to understand the risks: in pretty much all aspects of digital technology, we’re faced with having to balance benefits against risks. Only through awareness of those risks can we make an informed choice, and/or press for legislation and greater security, lest we accidentally find ourselves inviting an Orwellian surveillance state into our living rooms.