lifting shop

When I was in middle school I got good grades. My lowest grade was a B+ in 8th grade algebra, an advanced course for kids who would go on to study geometry as freshmen in high school. I struggled to understand the ideas in that class. The concepts were foreign to me: using variables to get answers made no sense. Factoring was impossible, and my brain never quite caught on to polynomial equations; I still do not know how to solve them. I have weaknesses, whether out of my own cognitive limitations or because I never took the time to learn the lessons. But one thing I do know: even at that young age, I had learned what cheating was.

In 8th grade life science class, I sat next to a kid who was a troublemaker. The teacher had thrown his desk across the room one day after he recited one too many Jerky Boys quotes as clever retorts to questions like: what are the seven classifications of living things? I never blamed the kid. He was cool, and he had to own that. Casting himself in that role did not make him immune to the rules; it made the rules an afterthought. His young memoir might have read: The Indirect Consequence of Bart Simpson Economics and the High Cost of Trying to be Him. It might have been a quarterly review of bad kid behavior, published as a sort of poster-child magazine for what not to do, or for what to do when you want to get into trouble. In magazine form it might have amassed an enormous following in the 1990s. Today it would be a highly acclaimed blog written by the newest generation of pranksters.

While taking a test next to this kid, I got the sense that he was looking over at my paper. Glancing out of the corner of my eye through my hair, I felt he was planting his eyes where they did not belong. The hair on my neck prickled. I froze. In an automatic response, I covered my paper. I curled my arm around my test, cradling it in a way that said, “Don’t worry, baby. I won’t let anything happen to you.” I’m not sure whether I went into protection mode because he was actually looking over at my paper or not. Maybe what I thought about him, and what he must have felt about himself, prompted me to act preemptively, but that’s not the point. Even back then, I didn’t have to be taught what cheating was or how to prevent it.

This covering of the test was a sort of universal sign instilled at a very early age. Thanks to teacher instruction, I knew at the age of 13 that cheating was wrong and should be prevented at all costs. Teachers often pace up and down the rows during tests to discourage wandering eyes. Mr. Lynch, my sixth grade English teacher, would say before vocabulary tests, “Keep your eyes on your fries or it will spell your demise.” Thanks to the culture of academic learning, most classroom management styles include placing student desks far enough apart to prevent cheating, and a teacher must be present in the classroom while students take a test. These are not simply tenets to live by, but the environment in which students learn the acceptable norms of study and instruction.

I do not feel ashamed that I covered my paper, although that kid accused me of “thinking that he was cheating off me.” I was supposed to feel embarrassed, paranoid, even weird about my reaction, but I never did, and I guess I never will. What I realize now is that there are ways of instilling ideas that stick with you your whole life. As one grows older, the ways one can cheat change from test taking to more intricate forms of plagiarism, and the preventative measures morph from covering the test paper to legal protection, trademark, and patent law.

Intellectual property includes inventions; literary and artistic works; and symbols, names, and images used in commerce (WIPO). There are two categories of intellectual property: industrial property, which includes patents for inventions, trademarks, industrial designs, and geographical indications; and copyright, which covers literary works, films, music, artistic works, and architectural design (WIPO).

Intellectual property rights allow creators to profit from their work without harm from those who would steal it. These protections sustain things like the film industry, clothing lines, and technological gadgets such as the iPhone.

The idea of industry protection reminds me of a lawsuit filed in federal court over the movie Out of the Furnace.

Seven of the 17 plaintiffs – all members of the Ramapough Lunaape Nation — use Degroat as their surname or middle name. In the movie, [Woody] Harrelson portrays Harlan Degroat, the leader of a violent criminal gang who lives in the mountains of New Jersey.

The film follows the film’s star, Christian Bale, as he tries to keep a younger brother played by Casey Affleck from the clutches of Degroat’s criminal gang.

“The community is depicted as lawless, drug addicted, impoverished and violent,” lawyers for the Ramapoughs wrote. (Zambito)

Lawyers for the film’s producers, Relativity Media and Appian Way, headed by actor Leonardo DiCaprio, warned that if the lawsuit went forward it would expose the film industry to legal challenges from those who disagreed with a movie’s content.

“Plaintiff’s lawsuit, if permitted to proceed beyond the pleading state, will chill free speech by subjecting creators and distributors of movies and other works of fiction to liability whenever some members of a distinct ethnic, cultural social or other definable group dislike how their group is presented,” attorney Mark Marino wrote in April (Zambito).

The real danger of these lawsuits is that, without intellectual property rights, the film industry would be vulnerable to attacks like these, diverting its profits to those who were not creators but fraudulent profiteers of intellectual property.

In a similar case of patent law, money changes hands, this time due to patent violation, where using someone else’s patented technology violates the law. The holder of the ’504 patent sued Marc Maron, comedian and host of the WTF podcast (Chace). “Others who have been sued include Jesse Thorn, host of public radio show, Bullseye, ABC, CBS, and Adam Carolla” (Chace).

Jim Logan, a New Hampshire man, applied for his patent in 1996 (Chace). His idea was that audio could be downloaded from the internet and listened to by consumers (Chace). Specifically, the ’504 patent “covers important technology related to automatically identifying and retrieving media files representing episodes and a series of those episodes becoming available. These patented techniques are commonly used in podcasting” (Chace).

In the ’90s, Jim Logan started a company, Personal Audio, which tried to build podcasting technology. The technology ended up in the form of science magazines recorded onto cassette tape (Chace). Although the tapes didn’t last, his patent came in handy. In 2007, Logan sued Apple for violating his podcast patent and won 8.5 million dollars (Chace). The suit was appealed and Apple settled for an undisclosed amount (Chace). The judge found that the iTunes application infringed the ’504 patent: iTunes embodied the idea of a podcast, where a menu or array of audio episodes is made available to a consumer (Chace). Although the technology did not yet exist when the patent was created in 1996, the patent was still violated (Chace).

Jim Logan’s company didn’t create iTunes, and his patents would not have told you how to build it: where to put the processor, which lines of code to include in the program. But once the engineers at Apple figured it out, Jim Logan came out of hiding and sued them (Chace). The film that wraps itself around patent law is the thinly veiled premise that an idea can be stretched over any technology that accomplishes the goal of the patent, years, even decades after the patent was created. This brings about conflict. When the everyman, just doing his thing, can be prosecuted and extorted out of his own livelihood, there is a problem (Chace). President Obama even spoke out against this (Chace). The problem of patent law is an ongoing one, and it remains to be seen how it will be resolved: with regulation, more laws, or even more patents as band-aids for the failed patent regulation of recent years. From patent violation to plagiarism, the issues can be similar, but the outcomes are quite different.

Journalistic plagiarism is the representation of someone else’s ideas or direct quotes without proper attribution. By failing to put quotation marks around a passage or to credit the source, an author commits plagiarism. The problem with plagiarism is that it happens everywhere, with authors lesser known and those highly lauded in the international community.

Kendra Marr resigned from Politico on October 13, 2011, after including passages from other writers without proper attribution or quotation (Sporer). “[T]he articles drew from a range of sources without proper attribution, including reporting from the Washington, D.C.-based newspaper The Hill, the Associated Press (AP), and the Scripps-Howard News Service, the note said” (Sporer). On Nov. 10, 2011, Jim Romenesko resigned from The Poynter Institute following accusations of improper quote attribution, bringing an abrupt end to his 12-year tenure running the think tank’s media aggregation blog (Sporer).

In March of 2011, The Washington Post suspended Pulitzer Prize-winning reporter Sari Horwitz for three months for copying segments of stories from The Arizona Republic “in whole or in part” without attribution (Pexton; Sporer). Horwitz copied and pasted material from the Republic directly, without attribution, on two separate occasions during her coverage of the investigation of accused Arizona gunman Jared Lee Loughner (Sporer). George Orwell Prize-winning journalist Johann Hari directly copied and pasted a quote from a book into London’s The Independent (Sporer). “Hari took a four month unpaid leave” (Sporer).

The solution to plagiarism is simple. Writing for Chicago Magazine’s staff blog The 312, Whet Moser wrote in an October 14 post that referring to the original article with full attribution, and advancing its reporting with original research, presents “a simple, ethical solution” (Sporer). “Steve Myers of the journalism think tank The Poynter Institute said, ‘Just because someone doesn’t aim to malign doesn’t make his actions benign.’ [H]e proposed attribution as a solution to some types of plagiarism” (Sporer).

In a recent article in Plagiarism Today, Jonathan Bailey outlines the pressures of journalism in today’s fast-paced world. Describing the temptation, Bailey writes, “At some point, the only way to keep up with the demands of the job is take shortcuts. Plagiarism, fabrication and recycling are ethically dubious but effective ways to shave minutes or hours off of production time” (Bailey). That plagiarism affords one an eloquent justification for prolonged employment seems ridiculous. The solution to plagiarism is to cite appropriately and credit the author. It doesn’t take much time to insert quotation marks and a parenthetical citation. The justifications for plagiarism, though, are seemingly endless.

Bailey goes on to compare the pace of the journalism profession to the pace of actual plagiarists. “At some point it is almost physically impossible for anyone to meet the writing demands without taking shortcuts. This is why essay mills, which have to turn around complex and lengthy research papers in as little as a day, have high rates of plagiarism themselves. When journalists have to churn out specialized text at the same rate as essay mill authors, ethics and training may not be enough to save them.” (Bailey)

Bailey says that the pace of the internet has made the expectations of online publishers too high to achieve without plagiarism. “Ever since the Internet became central to the way people consume content, there’s been a push by editors and publishers to create more and more content, to beat competitors by being the first online, having the most stories and iterating quickly” (Bailey). This ugly practice, however, is easy to expose. “Unfortunately, due to how easy it is to detect plagiarism, it will likely be plagiarism that will serve as the early warning” (Bailey). At first, I wonder whether Bailey himself is in the right profession: if he cannot keep up with the high pace of writing, he should leave the writing profession to those who can. Then I reflect on what has failed Bailey, and understand how his story represents an even greater population of Baileys, those who plagiarize. Of this group, some plagiarize with this level of consciousness, and others plagiarize out of ignorance. Taking a long-term view, the kids who sat next to Bailey in the classroom, were they Baileys, too? Aren’t we all just little Baileys, waiting to hatch? It’s like a virus in waiting, lying dormant for years, or something inherent in us all that inevitably clicks on. There is a way to cut down on plagiarism, and it starts in the classroom.

According to the American Psychological Association website, there are four strategies for preventing plagiarism in an academic setting. The first is to create assignments that require more than summarization, asking specific questions that expect the student to read and understand the source material so that he can integrate it into the assignment (Prohaska). The second is for the teacher to explain expectations and define plagiarism, including briefing students on the ease of plagiarism detection technology (Prohaska). The third is to show students how to avoid plagiarism and to ask for students’ own definitions of plagiarism (Prohaska). This lets the teacher monitor students’ opinions, clears up ambiguous meanings, and promotes autonomous decision making throughout the writing process (Prohaska). The fourth is to show students how to properly paraphrase quotes (Prohaska). “Students may assume that the likelihood of successfully plagiarizing is low when an instructor has devoted time and effort to teaching about it” (Prohaska). The efforts in academia set the foundation for ethical behavior in life, not only for future writers and journalists but for everyone who does something else with their careers.
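The detection technology the APA mentions is less mysterious than it sounds. As a minimal sketch of my own (not any particular product’s method), overlap-based detection boils down to comparing word n-grams between a source and a submission; the sample texts and the five-word window below are illustrative only.

    # A minimal sketch of overlap-based plagiarism detection: compare
    # word n-grams between two texts and report how much they share.
    # Real detection services are far more sophisticated than this.
    def ngrams(text, n=5):
        words = text.lower().split()
        return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

    def overlap(source, submission, n=5):
        """Fraction of the submission's n-grams that also appear in the source."""
        sub = ngrams(submission, n)
        if not sub:
            return 0.0
        return len(sub & ngrams(source, n)) / len(sub)

    original = "the quick brown fox jumps over the lazy dog near the river bank"
    student = "my essay says the quick brown fox jumps over the lazy dog today"
    print(f"Shared 5-gram fraction: {overlap(original, student):.0%}")

Even this toy version flags lifted phrasing instantly, which is why unattributed copying is so easy to catch.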

Anyone in this country knows about cheating: that it happens, and that you have probably witnessed it or been a part of it in some way. The problem is systemic, in that any measure of success requires you to avoid it. There are measurements for success and guidelines to adhere to, and having followed the rules implies you will probably keep following them, but doesn’t that just mean you have successfully avoided breaking them? Although there are rules, following them does not make you an automatic success. Students will perform in any number of ways, and there are still students coming up in the system who have yet to be marked as plagiarists.

Although there are deterrents, there will always be behaviors that lie outside the acceptable norms. It’s just our way. Failure happens. Smoking still looks cool. People still fail to wash their hands as often as they should; even nurses don’t always wash after touching germs. Because everything is relative, there must always be a range of characters in the human population. There are better and worse grades on a scale of A through F. There are good and bad apples. And the way through this world is not to ignore the problem children, but to teach them as well as they can be taught, reaching through the rungs as best you can to make an impact. It is the same way someone picks up an apple and says to it, “I can trust you as far as I can throw you.” This, too, is a way of measuring someone: how well they score on a test, or how well they behaved in grade school, before the numbers really amounted to anything major. The important thing is that it’s still anyone’s game. Anyone can try to succeed and play by the rules, but it’s still cool to be like Bart Simpson.

References

Bailey, Jonathan. “The Looming Plagiarism Crisis.” Plagiarism Today. Plagiarism Today, 29 Jul.

2014. Web. 24 Feb. 2015. https://www.plagiarismtoday.com/2014/07/29/looming-plagiarism-crisis/.

Chace, Zoe. “How One Patent Could Take Down One Comedian.” npr.org. NPR, 5 June 2013.

Web. 1 March 2015. http://www.npr.org/player/v2/mediaPlayer.html?action=1&t=1&islist=false&id=188719954&m=188841801.

Council of Writing Program Administrators, National Council of Teachers of English, and

National Writing Project. Framework for Success in Postsecondary Writing. CWPA, NCTE, and NWP, 2011. PDF file.

Pexton, Patrick B. “The damage done by Post reporter Sari Horwitz’s plagiarism.” The

Washington Post. The Washington Post, 18 March 2011. Web. 1 March 2015. http://www.washingtonpost.com/opinions/the-damage-done-by-post-reporter-sari-horwitzs-plagiarism/2011/03/18/ABgtIIs_story.html.

Prohaska, Vincent. “Encouraging students’ ethical behavior.” American Psychological

Association. APA, May 2013. Web. 8 March 2015. http://www.apa.org/ed/precollege/ptn/2013/05/ethical-behavior.aspx.

Sporer, Mikel J. “Attribution Controversies Prompt Reexamination of What Constitutes

Journalistic Plagiarism.” Silha Bulletin 17.1 (2011): n. pag. University of Minnesota. Web. 1 March 2015. http://silha.umn.edu/news/Fall2011/attributionplagiarism.html.

World Intellectual Property Organization. What is Intellectual Property? WIPO, n.d. PDF file.

http://www.wipo.int/edocs/pubdocs/en/intproperty/450/wipo_pub_450.pdf.

Zambito, Thomas. “Judge Tosses Out Ramapough Mountain Indians Lawsuit Over ‘Out of the

Furnace.’” nj.com. nj.com, 15 May 2014. Web. 1 March 2015. http://www.nj.com/news/index.ssf/2014/05/judge_tosses_out_ramapough_mountain_indians_lawsuit_over_out_of_the_furnace.html.


Changing the headline

When I think about security, I wonder how things are affected by it. Simply from the appearance of safety, one might do any number of things: take a walk at dusk; let a baby pet a pit bull; ride a zip line over a thirty-foot drop onto rocks. At one time or another, someone decided these things were OK to do, so they did them. Even without the appearance of danger, someone might still reflect on the risks and brave the consequences. You don’t have to be brave to do these things, but knowing that what you do can result in injury is one component a person may or may not weigh before doing anything.

I know a girl who rode a zip line in Vermont and fell onto a pile of rocks. She suffered a concussion that changed her ability to process loud noises. For a year and a half after the fall, she could not read or use the computer for more than minutes at a time. She could not be in the same room as two other people speaking at regular volume. Indoor lights bothered her, so she wore sunglasses during the day. Before her accident, she could process sound, read books, and log onto Facebook without feeling ill or needing protective eyewear. She could run, jump, and yell; she was like a wild animal. Her injury changed her, if only for a time.

I feel like people know so much nowadays about the dangers of things. The risks involved in anything are great, and we know how, in an instant, all we thought guaranteed might vanish or slip away. This is not unlike how, when we use the computer, we assume things will be secure. The simple click of a button affords us this. Now and again I realize how readily I rely on convenience to be there, on technology never to fail, and on people to go on as they did the day before and the day before that.

Because we rely on past experience to predict future outcomes, and because yesterday was somehow fine, I am confident that tomorrow will be the same. This is a human error: we grow over-confident about the future based on past results. This is how people can do seemingly silly things on the mere appearance of security, the mark of one day being measured by the prior day’s success. Our failed logic follows a pattern that reads like a gambler’s in a casino, or a thrill seeker’s in life. Our sense of security comes not from the precautions we have taken against threats, but from ignorance of the real attacks that might happen in the absence of any precaution whatsoever.
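The gambler’s arithmetic is worth making concrete. Here is a small back-of-the-envelope calculation of my own (the one-percent daily risk is purely hypothetical): even a tiny chance of trouble each day compounds into near-certainty over a year, which is exactly why “yesterday was fine” is such poor evidence.

    # A small illustration of compounding risk (the daily figure is
    # hypothetical): if each day carries a 1% chance of a security
    # incident, at least one incident over a year is nearly certain,
    # even though any single day looks safe.
    daily_risk = 0.01
    days = 365
    p_at_least_one = 1 - (1 - daily_risk) ** days
    print(f"Chance of at least one incident in a year: {p_at_least_one:.1%}")
    # prints about 97.4%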

The failure of wireless users to protect their connections is a gamble that everyone takes. In a study published in the Communications of the ACM, Chenoweth, Minch, and Tabor used a college campus to study this behavior (Chenoweth, Minch, & Tabor, 2010, p. 135). The study examined “wireless user vulnerabilities” and “security practices” in an attempt to measure how many users’ connections are unprotected (Chenoweth et al., 2010, p. 135). The study also tallied the wireless devices “compromised by malicious applications”, such as viruses, worms, and surveillance software (Chenoweth et al., 2010, p. 135).

Our goal was to directly investigate how well wireless users are securing their computers and the threat level associated with wireless networks. Using a university campus wireless network, we performed a vulnerability scan of systems shortly after users associated to campus access points. The scans were performed using Nmap (www.insecure.org), a popular open source scanning tool. The results of the Nmap scans were used to determine the proportion of wireless users not using a firewall, the prevalence of malicious applications, and the proportion of users with open ports. (Chenoweth et al., 2010, p. 135)
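Nmap itself is a full scanning suite, but the core idea the researchers relied on, probing a host to see which TCP ports accept connections, can be sketched in a few lines. This is my own simplified stand-in, not the study’s code; the address and port list are hypothetical.

    import socket

    # Simplified stand-in for an Nmap-style probe: attempt TCP connections
    # to a few well-known ports and report which ones accept. Nmap adds
    # service detection, timing control, and many scan types on top.
    HOST = "192.0.2.10"            # hypothetical address (TEST-NET range)
    PORTS = [135, 139, 445, 80]    # file/print-sharing and web ports

    def probe(host, port, timeout=1.0):
        """Return True if a TCP connection to host:port succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    open_ports = [p for p in PORTS if probe(HOST, p)]
    print(f"Open ports on {HOST}: {open_ports or 'none detected'}")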

The researchers chose this population because it directly represents how the general population uses wireless networks. Other than user authentication, there are no security measures (such as WEP) in place on the wireless network, although users agree at login that their system patches are current, that they are using an anti-virus program, and that they understand they are subject to university computing policies (Chenoweth et al., 2010, p. 135). If users desire additional security, they must provide it themselves (Chenoweth et al., 2010, p. 135). This environment of minimal network-level security and heavy reliance on user initiative makes the campus wireless network reasonably representative of public hotspot-based wireless networks in general (Chenoweth et al., 2010, p. 135).

Subjects for the study were authorized users of the campus wireless network. The total university population includes 18,599 students and approximately 2,100 faculty and staff. The university is a commuter campus with a non-traditional population of 15,779 undergraduate students (average age 26) and 1,663 graduate students (average age 36), with 54% female and 45% male (1% unspecified). Most students live off campus, and many have part-time jobs or full-time careers, often with one of several local high-tech firms. We view the non-traditional nature of the student subjects as a positive factor for the study as we believe it makes them more representative of the general public and workforce than traditional students would be. (Chenoweth et al., 2010, p. 135)

Because the study mirrors the real world, its results serve as a measure of the steps people in the general population take, or fail to take, to secure their wireless connections.

The results of the study are illuminating. The Nmap scan data show that 304 computers (9.13% of the 3,331 computers scanned) were not using a firewall (Chenoweth et al., 2010, p. 136). Even with a firewall enabled, systems can have open ports (Chenoweth et al., 2010, p. 136).

Since any open port is a potential security risk (Chenoweth et al., 2010, p. 136), the study measured open ports and found that 287 computers scanned (8.62%) had at least one detectable open port (Chenoweth et al., 2010, p. 136). Of those 287 computers, 189 (65.85%) had at least one open port with well-known vulnerabilities, while the other 98 (34.15%) had none (Chenoweth et al., 2010, p. 136). Simply put, when a user had open ports, more than 65% of the time at least one of them posed an important security risk (Chenoweth et al., 2010, p. 136).
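These proportions are easy to reproduce from the raw counts reported above, as a quick sanity check:

    # Reproducing the study's percentages from the counts quoted above
    # (Chenoweth et al., 2010, p. 136).
    total_scanned = 3331
    no_firewall = 304
    with_open_ports = 287
    vulnerable_open = 189

    print(f"No firewall:      {no_firewall / total_scanned:.2%}")        # ~9.13%
    print(f"Open ports:       {with_open_ports / total_scanned:.2%}")    # ~8.62%
    print(f"Risky among open: {vulnerable_open / with_open_ports:.2%}")  # ~65.85%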

The most frequently open ports are also some of the most dangerous. The top three open ports were designed for file and print sharing across computer clusters and can potentially be exploited by attackers through null sessions. (Chenoweth et al., 2010, p. 136)

Individual systems can use “null sessions” (no username or password required) to establish connections between computers using these ports. It is well known within the security community that it is possible for an attacker to exploit null sessions and gain access to a system through one of these ports. (Chenoweth et al., 2010, p. 135)

Malware can do many things, including keystroke logging, username and password detection, and monitoring of online activity. It allows someone besides yourself to silently view and capture your personal information, including credit card accounts, personal emails, Google search history, and Social Security number.

A total of 17 computers (0.5% of the computers scanned) had at least one malware application installed. Although a small number relative to the total number of wireless users, the existence of malware is important because any one of these infected systems may be used to launch attacks against the larger client population. (Chenoweth et al., 2010, p. 136)

Many infected computers had multiple malware applications present. Of particular interest, and somewhat alarming, is the presence of network monitoring and packet sniffing applications. Of the 17 infected computers, 12 also had at least one network monitoring/packet sniffing application. The most common network monitoring tools found were Nessus, Bigbrother, and Netsaint. (Chenoweth et al., 2010, p. 136)

Are vulnerabilities consistent across every user’s system? No. On shared networks, however, the connection is only as secure as its most vulnerable link. The 17 computers already infected with malware were staging grounds for potential attacks on the rest of the 3,331 computers. If everyone is as exposed as the least protected user, then everyone is under threat of attack.

Is the technology worth the risk? The question becomes more pointed when users carry work laptops and mobile devices outside of work, exposing their companies to security breaches. The threat is real, but the question remains. Is it worth it? Do you feel lucky? I am reminded of so many things when I think about this risk, among them an episode of the NBC TV show 30 Rock. In one episode, Tracy Jordan (Tracy Morgan) and Jack Donaghy (Alec Baldwin) are talking about how to change the public’s perception of Tracy.

Jack:

Everyone thought Prince Hal was a drunken wastrel. But when he became king he transformed himself into a wise and just ruler. He changed the headline. That’s what you have to do, Tracy. If you’re open to it, I’m very good at giving advice. For instance, with your obit[uary] problem. You’ve spent years creating a certain public image, but you can change that. You just have to do what Prince Hal did.

Tracy:

You know something, Jackie D? That thing I said earlier about Prince Hal got me thinking. I have to change my headline.

Jack:

Yes, that’s what I just said. Now if I can help you…

Tracy:

No, no, no Jackie D. I don’t need your help. I’m Tracy Jordan. When I go to sleep, nothing happens in the world. (Gentleman’s Intermission)

Sometimes we all want to be Prince Hal. If we go to sleep, nothing happens in the world. We are not at risk. Nothing bad happens. This is the same approach that so many take when securing their computers at home. If the risk never comes to bear, it all might be best left to chance.

References

Chenoweth, T., Minch, R., & Tabor, S. (2010). Wireless Insecurity: Examining User

Security Behavior on Public Networks. Communications of the ACM, 53(2), 134-138. http://eds.a.ebscohost.com/eds/pdfviewer/pdfviewer?sid=043d2ad0-0c4c-47a3-b75a-0d0faef42c18%40sessionmgr4004&vid=1&hid=4210.

Gentleman’s Intermission. (2015). Retrieved from

http://www.30rockquotes.net/seasons/season_5/30rockquotes_gentlemans_intermission.cfm.

The Green Place

Sometimes during the summer I like to go swimming. Oh, who am I fooling? I always want to be immersed in some new body of water; it’s like braving a dangerous element in exchange for a feeling I cannot quite express. I cannot breathe underwater, but I can hear something down there that gives me a kind of strange comfort in a world I cannot fully see but feel and understand with different senses. I will lie belly up with my ears underwater for hours just to look up and wonder at the sky.

Walden Pond in Concord, MA, is a bell-shaped freshwater pond, and I’ll head there early in the morning on some Sundays in August. Around the perimeter there are little private beaches, rock steps leading into the water. The place is welcoming and a bit of a mirage, but it is very much there. I go there so seldom now it often slips from my memory, but it is not my imagination. The water has a magical feeling to it, and it gets so warm in summer that it sustains life in different ways. A few summers ago, news outlets reported that a strain of jellyfish was living in the water. What did people make of it, a typically salt-water organism living in fresh water?

“Gwen Acton thought the dime-sized translucent pods she saw … were strange, beautiful seeds that had drifted down to the water surface from some flowering plant” (Daley, 2010).

Melissa Webster said, “We saw them on most of our swims during September, and on our last swim Oct. 1. Definitely cool!” (Associated Press, 2010).

Gsinger said, “I have been swimming in Walden for 30 years and had never previously seen them. Has something changed?” (Associated Press, 2010).

Chris said, “This is a very scary event. They are a great danger to the native animals in the lakes and to the water. [T]his is a[n] event that should be looked into with great con[c]ern [and] is a sign of the danger nature is in” (Associated Press, 2010).

Although the jellyfish pose no threat to humans, perceptions varied. Some were in awe, some were happy to see them, some were frightened, and others feared the apocalypse. What anything can tell us about being human lies in our perceptions. Anything we encounter can yield a feeling or impulse to which people attach meaning. It could be a jellyfish or something else.

Long before today, our ancestors looked up at the sky, saw the sun and moon, and told each other stories about what each one did. There were gods possessing great virtue that ruled the stars. There were heroes and villains entangled in legends of a shared narrative, crafted to make some sense of what was going on in the world.

Today we know the sun is a star and the moon a satellite, bodies in the choreography of our solar system. We have the answers to what we think are the big questions. We think we have it all figured out, and to a large extent, we do. A lot of what we place meaning on will propel us into the next age. Sometimes a jellyfish is bad; other times he is a fascination. What we do with these meanings will be reflected in our behavior.

This week I have been reflecting on how people think about things, namely bad things. Oops, there I’ve gone and placed meaning on something without thinking. This week I have been thinking a bit about computer security vulnerabilities and how they are perceived. What I found reminds me of problems I have faced in my own life, and how those problems were thought of at the time. In retrospect, with what I know now, those initial impressions have fallen away or changed somewhat in my thinking.

The research I have done has changed my view of computer vulnerabilities: they are not something to fear for the loss of one’s data, but an industry unto itself, a larger narrative that does not involve me directly so much as my data and personal information. I am part of the bigger war, but I feel less a participant in it, less in control of my own personal information, and oddly more at peace with computer vulnerabilities than I thought I would be.

I have often thought about it for a moment and forgotten it, a shell picked up on the shore and thrown back into the sea. I have stored up moments that I would like to feel have given me a sense of something, but then I remember how things in the past exist only in your mind. How you cannot take it out and measure it, but you can place a meaning onto something and that can be everything that it is now, so that all that is left is your impression of what used to be.

Sometimes your own conscious thought dissolves into layers of waves the way an ocean does, in the sense that your cognitive awareness doesn’t even have surface tension, so the points of arrival and departure are always changing, always different and new, reflective. This is how one person can have a thought about a thought. This wondering is an unfolding of constantly redeemed perks or chits that have no expiration date and whose value changes in relation to meaning. These things may exist only in the mind, but they are without a doubt changing and evolving as a direct function of one’s thinking and reflexive mind.

Are there known vulnerabilities in software that is rolled out without adequate testing? To answer this question, one need only think of one’s own computer experiences. Have you ever had a hard time using a computer? Yes. My impression of Microsoft stems from its history of long and confusing installation processes, which make it more challenging to add software to your computer. “On the Windows desktop, users have to open their browsers, search the web, download an application from a website, and install it manually” (“HTG Explains,” 2015).

As a result, a number of things can happen, including security breaches.

Many less-savvy users may end up downloading dangerous software or clicking a fake “Download” button that leads to disguised malware. Users may download and run potentially dangerous types of files, such as screensavers, without knowing that they contain executable code and can infect their system. People downloading pirated software from questionable websites may end up infected. (“HTG Explains,” 2015)

In comparison, Apple’s built-in features make it simple to add programs: one can simply press a button and voila! The program is added without hassle. Apple users install applications and software “that come from a trusted, centralized repository. Users open their app store or package manager, search for the program, and install it” (“HTG Explains,” 2015). I have reflected on these two experiences, the Microsoft design and the Apple design, and balked.

Who would ever pay money for the Microsoft experience? It creates a product for which outside service and repair are almost a requirement. There is a much higher risk of viruses, too, as the PC has been the target of the majority of worms.

Windows XP shipped without a firewall enabled and network services were exposed directly to the Internet, which made it an easy target for worms. At one point, the SANS Internet Storm Center estimated an unpatched Windows XP system would be infected within four minutes of connecting it directly to the Internet, due to worms like Blaster. (“HTG Explains,” 2015)

In addition, Windows XP’s autorun feature automatically ran applications on media devices connected to the computer. This allowed Sony to install a rootkit on Windows systems by adding it to their audio CDs, and savvy criminals began leaving infected USB drives lying around near companies they wanted to compromise. If an employee picked up the USB drive and plugged it into a company computer, it would infect the computer. And, because most users logged in as Administrator users, the malware would run with administrative privileges and have complete access to the computer. (“HTG Explains,” 2015)

Part of the problem is the intention of Windows’ original design. “Historically, Windows was not engineered for security. While Linux and Apple’s Mac OS X (based on Unix) were built from the ground-up to be multi-user operating systems that allowed users to log in with limited user accounts, the original versions of Windows never were” (“HTG Explains,” 2015).

DOS was a single-user operating system, and the initial versions of Windows were built on top of DOS. Windows 3.1, 95, 98, and Me may have looked like advanced operating systems at the time, but they were actually running on top of the single-user DOS. DOS didn’t have proper user accounts, file permissions, or other security restrictions. (“HTG Explains,” 2015)

Despite, or perhaps due in large part to, these existing vulnerabilities, there is an opportunity to profit from them. When one looks at insecure computer systems not as an object of dread and avoidance, but as part of a larger economic system, things start to look different. The research I came across includes an article published in Time magazine in July 2014 titled “The Code War”.

The idea that a software bug can be worth actual dollars and cents is an odd one. Bugs are mistakes; people generally pay money to fix them. The fact that there’s a market for them is a consequence of the larger oddness of our present technological era, in which our entire world — our businesses, medical records, social lives, governments — is emigrating bit by bit out of physical reality and into the software-lined innards of computers in the form of data. A lot of people are interested in that data, for reasons both good and bad. Some of those people are spies. Some of them are criminals. Bugs are what they use to get at it. (Calabresi, Frizzel, & Grossman, 2014)

The Time article interviews Aaron Portnoy, co-founder of Austin-based Exodus (Calabresi et al., 2014). Portnoy’s career began in high school, when he hacked into the computer system at the Massachusetts Academy of Math & Science in Worcester (Calabresi et al., 2014). What links his initial hacking career to his current project is his love of hacking.

Portnoy, now 28, is the co-founder of a two-year-old company in Austin called Exodus Intelligence. Its mission statement reads, “Our goal is to provide clients with actionable information, capabilities, and context for our exclusive zero-day vulnerabilities.” Which means — translated from the quasi-paramilitary parlance that’s endemic to the software-security industry — that Exodus Intelligence finds and sells bugs, specifically the kind of bugs that could potentially give a third party access to a computer, the same way Portnoy got access to his high school’s network. They’re worth a lot of money. Vulnerabilities in popular applications and operating systems have been known to change hands for hundreds of thousands of dollars each. (Calabresi et al., 2014)

The industry of computer vulnerabilities is an enormous and international one. For example:

in May [2014] when the U.S. indicted five members of the Chinese army for stealing data from American companies, including Westinghouse and Alcoa. That wasn’t an anomaly; it’s the norm, and it’s getting more normal all the time. Retired Army general Keith Alexander, who formerly headed both the NSA and U.S. Cyber Command, has called China’s ongoing electronic theft of American intellectual property “the greatest transfer of wealth in history.” Two weeks ago several security firms confirmed that a group believed to be backed by the Russian government has been systematically hacking the U.S.’s energy infrastructure since at least 2012. According to IBM’s security division, the average American company fielded a total of 16,856 attacks in 2013. (Calabresi et al., 2014)

The history of computer vulnerabilities goes back twenty years.

In 1995 Netscape announced a “Bugs Bounty” program that paid cash to anybody who could find flaws in its browser. The company … just wanted to fix holes in its software. In 2002 a security firm called iDefense started buying up vulnerabilities of all kinds; another company, TippingPoint, launched a similar program in 2005. Both programs were created as alternatives to the increasingly active and chaotic exchange of zero-days on the open market — essentially they acted as safe zero-day disposal facilities, a bit like radioactive-waste repositories. If you found a bug, instead of selling it to the highest bidder, who would do God knows what with it, you could sell it to iDefense or TippingPoint for a reliable price, and they would alert their clients to the problem and work with the software vendor to get the bug patched. iDefense and TippingPoint had something else in common too: they both, in successive years, 2005 and 2006, hired an intern named Aaron Portnoy. (Calabresi et al., 2014)

What Portnoy does now is not so different from his internship at TippingPoint. At Exodus, nine engineers sit at computers all day:

banging on software looking for ways in: browsers, email clients, instant-messaging clients, Flash, Java, industrial control systems, anything an attacker could use as an entry point. “One thing we try to maintain is a capability in every major backup software out there, because that’s one of the juiciest targets,” Portnoy says. “If you get on an enterprise network, what is an administrator going to want to protect? Their important information. What do they use to protect that? Backup software.” (Calabresi et al., 2014)

When a researcher at Exodus finds a vulnerability, he or she types it up in a professional-looking report along with technical documentation that explains what it does, where it lives, what it gets you, how to spot it, what versions of the software it works on, how one could mitigate it and so on. Most important, Exodus provides you with an exploit, which is the procedure you’d have to follow to actually trigger the bug and take advantage of it. “Every single vulnerability that we give our customers comes with a working exploit,” Portnoy says. “If we can’t exploit it, we don’t even bother telling anyone. It’s not worth it.” Voilà, one freshly minted zero-day vulnerability. (Calabresi et al., 2014)

Portnoy takes pride in the superior quality and effectiveness of Exodus’ exploits. “We try to make them as nasty and invasive as possible,” he explains. “We tout what we deliver as indicative of or surpassing the current technical capabilities of people who are actually actively attacking others.” When a company hires Exodus, it does so on a subscription basis: you get a certain number of bugs a year for such-and-such amount of money. Subscriptions start at around $200,000. (Calabresi et al., 2014)

The vulnerabilities business has a mixed reputation, based on the presumption that the bugs it provides are being used for criminal or unethical purposes. A Washington, D.C., company called Endgame that sold vulnerabilities to the government for years was dubbed “the Blackwater of hacking” by Forbes magazine. Last year, when Endgame announced that it was getting out of the game, it did so triumphantly, as if it were kicking a heroin habit: “The exploit business is a crummy business to be in,” its CEO said. (Calabresi et al., 2014)

The reality is more complex. Exodus’ clients come in two basic types, offensive and defensive. Playing for the defense are security firms and antivirus vendors who are looking for information they can integrate into their products, or who want to keep their clients up to speed on what threats are out there. On offense are penetration testers, consultants who use Exodus’ zero-days to play the “red team” in simulated attacks on their own or other people’s networks. “If they want to show what a real attack would look like from a determined adversary,” Portnoy says, “we give them the tools to do that.” (Calabresi et al., 2014)

Strange as it sounds, there is comfort in the fact that many computer vulnerabilities, malicious bugs, and computer worms exist, and always will, as part of a general landscape traded at a hefty profit. As the author Chuck Palahniuk writes, men will be “slaves to the IKEA nesting instinct” (Kopal, 2009). Women will believe they are less than gorgeous beasts until they consume millions of dollars in beauty products to make them whole again. Kids will buy all the music they want, believing they are what they like, rather than who they are through personality, values, or behavior, found and maintained through authentic human interaction and genuine relationships with other people.

One can sleep soundly by taking a sedative of life, by truly not worrying over the thought that your online life is being exploited by some personal vendetta or a deeper need to defame your character. Those fears are for the people who buy and sell the glitches that get our data. It puts to bed all the paranoid claims that lie awake with you at night, toiling, wondering as you look up at the ceiling. To get out of this half-make-believe world made mostly of wires, one must find a green place, free from the dry desert you once thought was the barren lands. You must stop searching out there in the mirage and come home. Know you are part of the wasteland, and be thankful it’s not just all about you.


References

Associated Press. (2010, Sep. 10). Mysterious Jellyfish Invade Walden Pond. Retrieved from

http://www.wbur.org/2010/09/10/mysterious-jellyfish

Calabresi, M., Frizzel, S., & Grossman, L. (2014, Jul. 21). The Code War. Time, 184(3), 18-25.

http://search.ebscohost.com/login.aspx?direct=true&db=f5h&AN=96981364&site=eds-live

Daley, Beth. (2010, Sep. 10). Mystery Blooms on Walden Pond. Retrieved from

http://www.boston.com/news/science/articles/2010/09/10/mystery_blooms_on_walden_pond/

HTG Explains: Why Windows has the Most Viruses. (2015). Retrieved from

http://www.howtogeek.com/141944/htg-explains-why-windows-has-the-most-viruses/

Kopal, Indira. (2009, Oct. 19). Tyler Durden’s anti-consumerism quotes. Retrieved from

http://indranikopal.blogspot.com/2009/10/tyler-durdens-anti-consumerism-quotes.html

code page 437

I think about some things in my life that have fallen away or entirely changed, and technology is certainly one of them. When I was in middle school, it was nearly the dark ages. I wrote my first research paper on the effects of smokeless tobacco. To gather research I went to the Northeastern University library at the Burlington, MA, campus. My mom and I would go together after school. She would drive me to the library. I would photocopy all my sources, then we would pile in the car, and head back home.

The resources I used then are like the caveman’s first paintings, well preserved as a record in time. The renderings of our ancestors that our recent relatives hang in the Smithsonian are not unlike the relics of my youth, which can likely still be found in any present-day library, procured by modern-day librarians: books, encyclopedias, and microfiche.

To write back then, I used a family computer that ran on MS-DOS, with a black-screen monitor that displayed orange letters in code page 437, the original IBM PC character set (“Code”). I printed my final draft on a dot matrix printer. I think about the resources I used then and how much things have changed.
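Code page 437 still lives on; Python even ships a “cp437” codec, so you can see exactly how those DOS-era bytes mapped to characters. The byte values below are just an illustration.

    # Code page 437 is the original IBM PC character set; Python's
    # built-in 'cp437' codec shows how DOS-era bytes map to glyphs.
    dos_bytes = bytes([0xB0, 0xB1, 0xB2, 0xDB])  # CP437 shading blocks
    print(dos_bytes.decode("cp437"))             # -> ░▒▓█
    print("smokeless tobacco".encode("cp437"))   # plain ASCII round-trips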

What was once a multi-venue, multi-resource process has been condensed into something much simpler. At home I have my laptop, computer, and printer. I don’t need to go outside my home, or outside my one device, to gather research. I can simply go online. I go to Google or use EBSCOhost to research my papers. This convenience has saved time and money and distilled the process of research into something effortless. I simply think of what I need, point and click, and I’ve got it. Writing and research are simply a whim at my fingertips.

When I think about Steve Jobs, I first remember my own first Apple technologies. The day was dawning. The first Apple products I ever owned were a laptop and an iPod I got as Christmas presents. I used the iTunes application on my computer to load songs onto my iPod, which let me listen to music in my car or on headphones while running. I think about Steve Jobs and I think about the technology, as I should. They weren’t necessarily interchangeable, but in my mind they go hand in hand. Jobs was the keynote speaker for new Apple products, and those keynotes fused the Apple computer with Jobs in a kind of shared sentience that I think will remain in my heart for a long time, in his memory.

I do not know how these technologies would have formed in a different world. In a parallel universe these products may have originated on a different timeline or found their success with different makers. Had it all been served on a plate with a glass of cold milk and chocolate chip cookies, I would have attributed its creation to Santa Claus, but it did not come about that way. Steve Jobs is credited with much of the commercial success of the Apple product line. His ability to speak and build excitement generated sales and success, and as a result he won countless awards.

Steve Jobs’ interest in typography directly contributed to the inclusion of typeface design and creative applications, including MacWrite and MacPaint, in some of the first Mac computers (Peterson). Calligraphy, the study of the formation of letters in different fonts with proper proportion and spacing, fascinated him (The Apple History Channel). His inclusion of writing and research tools in his computer designs helped drive the revolution in the research and writing process.

After Steve Jobs was fired from Apple, he founded NeXT in 1985. NeXT introduced the first NeXT Computer in 1988, and the smaller NeXTstation in 1990. The NeXT computers experienced relatively limited sales, with estimates of about 50,000 units shipped in total. Nevertheless, its innovative object-oriented NeXTSTEP operating system and development environment were highly influential (“NeXT”).

The NeXT computer featured an advanced graphical user interface and dynamic page generation. The systems also came with a number of smaller built-in applications, such as the Merriam-Webster Collegiate Dictionary, Oxford Quotations, the complete works of William Shakespeare, and the Digital Librarian search engine to access them. The NeXT computer was also used to create the World Wide Web: in 1990, Tim Berners-Lee used a NeXT machine to construct the first web browser and web server. (“NeXT”)

Jobs started NeXT and acquired Pixar, which made the first feature-length computer-animated film, Toy Story (“NeXT”). Pixar is the most successful computer animation studio ever (“NeXT”). Apple bought NeXT, and its ideas fueled Apple’s technology renaissance (“NeXT”).

Apple then started focusing on integrated software for personal devices like cameras, camcorders, and PDAs. This was known as the Digital Hub Strategy, where different devices and media link together, sharing data and common functions. It worked well for everything on the market except digital music players. Those devices were clunky and had poor user interfaces, so to fix the problem Steve Jobs had the Apple engineers design a new music player, the iPod. The iPod came out in 2001 (“Apple Press”).

With its essential integrated software counterpart, iTunes, it was the product game changer that Apple needed to surpass its competition. The iPod once again showed Steve Jobs’ design values. It was clean, white, simple and elegant. It also introduced a mobile device user interface to the industry. This was the stepping stone to the next product that would again change the world of design. (Peterson)

Steve Jobs revolutionized the way people use technology to access and learn about new music. What were once Walkmen, cassette players and CD players tethered to the ears by headphones, were static resources that could play one album at a time. The iPod offered a user interface for finding new music, and iTunes recommended new music with its predictive algorithms. With an iPod, one could store an entire library of thousands of songs, as opposed to one album. On February 12, 2012, Jobs was posthumously awarded the Grammy Trustees Award, an award for those who have influenced the music industry in areas unrelated to performance (“Steve Jobs”).
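Apple’s actual recommendation algorithms are proprietary, so as a toy illustration only, here is the simplest form the idea can take: suggest songs that co-occur in other listeners’ libraries with songs you already own. All the names and libraries below are made up.

    from collections import Counter

    # Toy co-occurrence recommender (nothing like Apple's real system):
    # score unseen songs by how often they appear in libraries that
    # share at least one song with mine.
    libraries = {  # hypothetical user libraries
        "ann": {"Song A", "Song B", "Song C"},
        "bob": {"Song B", "Song C", "Song D"},
        "cam": {"Song A", "Song C", "Song E"},
    }

    def recommend(my_songs, libraries, k=3):
        scores = Counter()
        for lib in libraries.values():
            if lib & my_songs:            # this listener shares my taste
                scores.update(lib - my_songs)
        return [song for song, _ in scores.most_common(k)]

    print(recommend({"Song A", "Song B"}, libraries))  # ['Song C', 'Song D', 'Song E']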

In January 2007, the clean, sleek, simple, and elegant iPhone was introduced to the world. Not only did the iPhone (and, three years later, the iPad) jump-start a whole new industry standard; Apple again opened another new media platform for design professionals, known as Mobile App Design (Peterson). Steve Jobs’ designs have shaped the way the media works. Most people can access news headlines from the likes of CNN, The New York Times, and the BBC with mobile applications. This has transformed news ratings from a TV platform to an individual window that anyone with an iPhone, smartphone, or other personal device can use.

Jobs was awarded the National Medal of Technology by President Ronald Reagan in 1985, with Wozniak (among the first people to ever receive the honor), and a Jefferson Award for Public Service in the category “Greatest Public Service by an Individual 35 Years or Under” (also known as the Samuel S. Beard Award) in 1987. On November 27, 2007, Jobs was named the most powerful person in business by Fortune magazine. (“Steve Jobs”)

In August 2009, Jobs was selected as the most admired entrepreneur among teenagers in a survey by Junior Achievement, having previously been named Entrepreneur of the Decade 20 years earlier in 1989, by Inc. magazine. On November 5, 2009, Jobs was named the CEO of the decade by Fortune magazine (“Steve Jobs”).

In November 2010, Jobs was ranked No. 17 on Forbes: The World’s Most Powerful People (“Steve Jobs”). In January 2012, when young adults (ages 16-25) were asked to identify the greatest innovator of all time, Steve Jobs placed second behind Thomas Edison (“Steve Jobs”).

In March 2012, global business magazine Fortune named Steve Jobs the “greatest entrepreneur of our time”, describing him as “brilliant, visionary, inspiring”, and “the quintessential entrepreneur of our generation” (“Steve Jobs”).

Two films, Disney’s John Carter and Pixar’s Brave, are dedicated to Jobs. Steve Jobs was posthumously inducted as a Disney Legend on August 10, 2013 (“Steve Jobs”).

There is no question these awards and accolades have only scratched the surface of the legacy of Steve Jobs. Jobs was a man, and he stands as a testament to the obelisks that we primates have encircled and made entirely our own. He is the discovery of fire that our Cro-Magnon forefathers came upon perhaps by chance, inevitably to find warmth. Sometimes people don’t criticize what they can’t understand, like the brilliant engineer Steve Jobs. Who could fault him? He was brilliant. But the salesman Steve Jobs has been accused of many crimes.

There’s hardly a cliche in the leftist lexicon liberals couldn’t have applied to Jobs and his customers: commodity fetishism, false consciousness, objectification and alienation, manufactured wants, the marketing of desire, and, most obviously, planned obsolescence. This last is the hoary charge from mid-century that American businessmen designed a product so it would soon be superseded by a similar product, compelling consumers to buy, buy, buy (Ferguson).

There is something here that aligns with a formula inherent in us all, one I am still trying to extrapolate. The variables are the American dream, entrepreneurship, and the capturing of what people all wish they had: tons of money. There is no question in my mind that Steve Jobs is the world’s greatest capitalist, yet to be toppled by the next big thing. What remains of this insight helps me step back from the dark precipice and examine what many hold up as a symbol of greatness when, in essence, we have been enculturated to believe this greatness is inside us all. To understand the idea of Steve Jobs, or any cult of personality we might think is great, is to understand our own flawed notions of greatness.

In season three of Community, the community college study group is fiercely recruited by Glee Club director Mr. Rad, played by Taran Killam, who preys on the group after losing his original members to a collective mental breakdown. “Glee, it’s like a drug that you use that turns pain into shoes and your shoes into dance” (“Baby Boomer”), the hypnotic song master croons to an unsuspecting target. Abed, played by Danny Pudi, sings, “Glee is what’ll spread to my friends like a virus that sends them to a healthier place” (“Baby Boomer”). The infected members of the study group infect others and double their efforts, eventually forming a complete glee club that will supposedly go on to perform at regionals.

After Abed infects his best friend Troy, played by Donald Glover, they turn Pierce Hawthorne, played by Chevy Chase. Pierce, an aging baby boomer, is particularly vulnerable due to his demographic’s “well-documented, historical vanity” (“Baby Boomer”).

“You, Pierce? Your generation invented music” (“Baby Boomer”). Pierce responds, “I don’t know about invented; perfected maybe” (“Baby Boomer”). The ensuing anthem heralds Santa as part of Pierce’s peer group, one who “fought at Woodstock and Vietnam, smoked a ton of acid and burned his bra” (“Baby Boomer”). The song credits Santa with the advent of “Spielberg and microchips” (“Baby Boomer”). Santa “invented Coca Cola and aerobics” (“Baby Boomer”). “He made the iron curtain and the Gremlins, too, fake butter and AIDS, and Twin Peaks” (“Baby Boomer”).

Pierce cuts in, singing, “You’re welcome for everything in the world. I’m Baby Boomer Santa, I bring the gift of the world” (“Baby Boomer”).

The remaining uninfected members of the study group backed out of the study room together, shaking their heads, promising each other it would never happen to them. In the coming days they all became infected, succumbing to Glee.

Glee in this analogy is the hype that surrounds anything, the social distortion that echoes around something huge, like the prequel trilogy to the Star Wars movies. They weren’t that great. I still love IV, V, and VI the best. They hold a place in my heart that will not be tarnished by the onslaught of the prequels. The first of the post-quels, Star Wars VII, is something I am eagerly anticipating, coming December 2015. It’s supposed to be great: “more practical effects, less CGI; captured on film, not digitally; and it will feel more authentic” (Mentel). It will have to be better than the prequels, in my hope of hopes.

In addition to the hype of inevitably bad movies, I think of incumbency as another useful marketing tool. “The percentage of incumbents who win reelection after seeking it in the U.S. House of Representatives has been over 80% for more than 50 years, and is often over 90%” (“The Power”). If he’s been in office for one term, what would another term hurt? The devil we know is safer than the devil we don’t. And that devil continues on to a lackluster second term, occupying space instead of breaking records or blowing our minds. “True, things can definitely get out of control when frothy-mouthed marketers promise life-changing miracles to get all of us to take notice” (Stapleton).

Regardless of the bad movies, there have been some great ones. And of the presidents we hold up as the top five, there are always some we have marked off that list. But there will always be something else, something more that we have not seen or cannot see, not without the hype that extends to the world hope, in the form of something we wish upon that grants us more wishes, until we have witnessed something grander than all our expectations. And the fact that we saw it happen means we all were there and were a part of it. And it will be held up in the annals of time as something historical, important beyond what words can convey.


References

“Apple Press Info.” Apple. Apple Inc., 2015. Web. 9 February 2015.

https://www.apple.com/pr/products/ipodhistory/.

“Baby Boomer Santa.” Wikia. Wikia, n.d. Web. 22 Feb. 2015.

http://community-sitcom.wikia.com/wiki/Baby_Boomer_Santa.

“Code page 437.” Wikipedia. Wikimedia Foundation, Inc., 7 January 2015. Web. 9 February

2015. http://en.wikipedia.org/wiki/Code_page_437.

Ferguson, Andrew. “The Steve Jobs Snow Job.” Commentary 132.5 (2011): 80. Print.

Mentel, Thomas. “8 Reasons Why Star Wars VII is Destined to Please.” The Cheat Sheet. The

Cheat Sheet, 10 Nov. 2013. Web. 22 Feb. 2015. http://wallstcheatsheet.com/stocks/8-reasons-why-star-wars-episode-vii-is-destined-to-please.html/?a=viewall.

“NeXT.” Wikipedia. Wikimedia Foundation, Inc., 23 January 2015. Web. 9 February 2015.

http://en.wikipedia.org/wiki/NeXT.

Peterson, Vicki. “How Steve Jobs Influenced the Modern World of Digital Design.” Symantec.

Symantec Corporation, 8 February 2013. Web. 9 February 2015.

http://www.symantec.com/connect/blogs/how-steve-jobs-influenced-modern-world-digital-design.

Stapleton, Dan. “Opinion: Hype Isn’t Always A Bad Thing – Hype Is Hope.” IGN. IGN Games

Newsletter, 30 Jan. 2015. Web. 22 Feb. 2015. http://www.ign.com/articles/2015/01/31/opinion-hype-isnt-always-a-bad-thing-hype-is-hope.

“Steve Jobs.” Wikipedia. Wikimedia Foundation, Inc., 9 February 2015. Web. 9 February 2015.

http://en.wikipedia.org/wiki/Steve_Jobs#Honors_and_public_recognition.

The Apple History Channel. “Steve Jobs Commencement Speech 2005.” Youtube. Youtube, 6

March 2006. Web. 9 February 2015.

“The Power of Incumbency.” Boundless Political Science. Boundless, 02 Jul. 2014. Web. 22

Feb. 2015. https://www.boundless.com/political-science/textbooks/boundless-political-science-textbook/congress-11/congressional-elections-81/the-power-of-incumbency-446-1638/.