“You speak. Siri helps. Say hello to the most amazing iPhone yet.”
So reads the ad on the back of the April 2012 issue of Wired magazine. We’re left to linger on that last word: yet. Never mind the agency we’re assigning to Siri, the overhyped, distinctly inhuman voice recognition software that yearns to sell you more widgets; focus instead on that triumphant little three-letter word. We are assured this is not the first iPhone, nor will it be the last. With each model we gain more features, claim more battery and bandwidth, and obsess over the nuances of the shape of the device. And yet there is an absent dimension to this story. This American Life and Mike Daisey (2012) did well to draw attention to the variety of costs borne by the workers who assemble these gadgets, and Jonathan Zittrain (2008) established years ago Apple’s ongoing project to demolish the open web and computer interoperability in favor of compartmentalized, proprietary, high-profit-yielding apps. Both of these would seem to me to be steps backwards. So why then do we so easily accept this as the “most amazing” stage of a technological evolution whose story we assume to be inevitable?
Progress. Or, more accurately, the false progress narrative that everyone seems to have had stuck in their head since the 1950s. Statements like “the evolution of technology” are exactly what keep it alive today. It’s a smaller piece of a greater whole, I’m afraid: a component (or proponent?) of technological determinism, the dominant, impetuous voice given to the presentation of technology and its relationship with society, which always takes the form of continual, self-contained (and self-evident) progression. The age-old classic might be the generalized (read: wrong) form of Moore’s law, the notion that computing capability accelerates in a predictable (periodic and exponential) and inevitable fashion. Faceless, decontextualized “technology” is seen to be a force of change because the darned stuff is somehow always causing itself to increase in speed and effectiveness, relative to how it was, of course. The net result of defining technology as progress is that society must adapt to it instead of shape it.

Some say that technology is merely the application of science. We informatics scholars claim this is faulty because scientists create as much as they ‘discover’ and, historically speaking, science and technology have not had clear-cut or consistent connections. The proposition that technology simply builds upon preexisting technology is equally misleading (not to mention it breaks Kuhn’s heart). Neither is technological progress an eternal project of addressing reverse salients, the unforeseen setbacks or problems resulting from the unfolding of technologies over time, because not all technology is designed to correct problems caused by previous technology. To be sure, human values are definitely embedded in technologies. Not only was the nuclear bomb a reflection of the values, capabilities and agendas of its time, but it really could be used in only so many ways. Technology comes about as a result of human ideas and agency; the direction it goes and the effects it has are largely up to us. Now, I’m not saying technology is purely socially constructed either. In some ways it may have limited agency, like the way a dead person’s Facebook profile might be continually animated by algorithms and interactions. But at the end of the day we are the ones who make sense of what it all means.
A lot of work on the subject of power and our current “information” society examines people’s ability to participate in it meaningfully, be it as part of global conversations, local democracy, or broad movements of social change. This work assumes that participation boils down to a matter (a requirement, really) of information access, known commonly as the digital divide, or, stated succinctly, the power differences between people or communities tied to varying levels of computer and internet opportunity.
Establishing the digital divide as our enemy necessarily embarks us on a quest for digital solutions, but the lack of material access to information technology, along with the absence of the skills, community support and perceptions needed to make effective use of it, is really a symptom of deeper, more prolonged issues. In the information revolution the have-nots are those who are simply digitally divided. Why do we forget to ask what caused them to be digitally divided in the first place? In some sense the digital divide is a moving target, as the make-up of information communication technology shifts as we look back over time. In this sense we’ve been in an information revolution (or crisis) for over thirty years. To suggest the information revolution is a regularized state of being is to render the term inadequate, but the term just keeps getting used. First it was the onset of significant availability of computers in businesses and homes in the 80s, then the widespread internet adoption that broke out in the 90s, and in the recent decade it has donned the hats of mobility, broadband and Web 2.0. Up next might be the semantic web. It is worth taking a step back, disentangling oneself from the ever-changing constitution of ICTs, and interrogating the underlying assumptions and agendas of the digital divide and the credence given to the proliferation of ICTs that we find wrapped up in the idea of the information revolution.
One might follow the lead of Jan Pieterse (2005), who questions the motivation behind the digital divide in his critique of information communication technologies for development, or ICT4D. His argument depends on the frame of digital capitalism, a world in which networks of corporations drive and dominate cyberspace and subject the world to certain flavors of media as well as the brunt of larger forces, like consumerism. ICT4D implies the imposition of flawed (or loaded) developmental models, such as the aforementioned technological determinism or neo-liberalism (market forces are development), that serve to mask the true intentions of insidious political and economic agendas: to make money off of poor people by selling more material goods and exploiting labor, to control markets with ideologies like intellectual property rights, and to force developing countries to choose between dependence on NGOs or corporate networks. Pieterse’s stance is accurate, if resoundingly pessimistic, and reminds us of the complex baggage we drag with us when we deploy ICTs to ‘bridge the divide’ between peoples as we “progress” in the alleged information revolution.
This is why I prefer to shift the conversation to literacy. In the vernacular, literacy is often taken to be equivalent to competency, proficiency or functionality, and is frequently affixed to other words to create compound meanings, such as information literacy, (new) media literacy, and stranger and more debatable pairings, such as emotional literacy. I take the term a step further than competency. As an educator I advocate for literacies that affect power: literacies comprised of social practices that foster critical social awareness as well as measurable knowledge and creative command of relevant communicative tools. Students who achieve some degree of mastery over these literacies are able to look at phrases like the evolution of technology or the information revolution and see them for what they are: political positions inscribed in terms that obscure the tangled masses of sociotechnical forces operating behind them. These same students can go on to actively create, share and remix information, media and ideas, becoming a conscious and intentional part of the drive behind the information revolution or technological evolution, as the case may be.
How to foster such literacies, however, is another subject entirely, and will have to wait until next time.
How will we pace ourselves into this future of (r)evolution?
Why You Should Be Afraid of Apple
I often have people ask me why I’m not a fan of Apple. I thought I’d try to gather my thoughts into something with a bit of organization. I had hoped it would be short, but it turned into a long rant. It’s also pretty coarse. Here we go:
- Practical: They want to control as much as they can, and that control costs you money… and freedom.
- Ideological: They don’t want information to be a public good, they want it to be a monopolized commodity.
- These philosophies, in turn, get implemented in their products.
Simple as that. Not enough? Let’s go through some examples, starting with this one:
http://techcrunch.com/2010/04/28/jon-stewart-rips-into-apple-over-lost-iphone-debacle-thats-going-to-leave-a-mark/
This post is a little harsher and has a more aggressive tone to it. Just envision me on a soap box and take it for its points of interest, not religious doctrine.
The iPhone trap and the new iPad threat
First, let’s tackle the mobile.
The App Lock
Jonathan Zittrain outlined this issue in 2008. If you buy an Apple phone you have to get or buy applications and music for it from iTunes. Apple limits what apps can be created and sold, and limits which devices they can be used on (own more than 5 devices in your lifetime? Tough luck, buy all your stuff again). If you buy a new Android phone later you then have to get or buy an entirely new set of applications for that phone. This means people who buy Apple are locked into using Apple, because switching out of it costs a lot of time and money. The same goes for Android, but at least it’s supported by all carriers and has many more phone manufacturers.
iPads are essentially big iPhones. People are starting to use them instead of laptops. They inherit all of the constraints of the iPhone, which means they are primarily sites of information consumption (not creation), and this consumption happens only on Apple’s terms. If they become as pervasive as laptops we will have a very large body of users who will be at a disadvantage. The social norms of computing will begin to change for the worse. Instead of buying and using software however, whenever and wherever you want, you will have to use it however, whenever and wherever Apple wants. They will find ways to make you pay money for various kinds of use, and you won’t have a choice to use something else, unless you want to pay a lot of money.
Alternative: Run an OS like Windows 8 (not available yet, but I’m investing all my hope in it) that will run on all hardware platforms (phones, tablets, game consoles, computers) and run any software built by 3rd party developers. Just like a real computer. On more powerful devices (computers, tablets like iPads) you will probably even be able to run a virtual machine to run Apple software!
Another Alternative: Jailbreak your iPhone. Install whatever you want. Sell it unlocked years later for more than you paid for it.
Yet Another Alternative: It seems that Apple’s high prices and unreasonable restrictions are causing it to start losing the tablet battle, as the Nook and Kindle push their way in. While they still lock you into media library collections, these devices can have Linux installed on them…
The Hardware Debacle
iPhones are not modular. You can’t upgrade memory or [easily] switch in a new battery (unless you pay them mad $$$$ or void your warranty). The USB/power adapter for an iPhone only works for iPhones. You are unable to use an iPhone as a mobile hard drive.
Alternative: Apple could make it easy to switch out the battery, use micro-SD or other common memory types that work in many devices, and make the USB connector something universal, like mini- or micro-USB. Let users partition up to 80% of the storage space to be usable as a mobile hard drive.
Redefining Information
Historically Apple has asserted control over iPhones in ways that eliminate choices. The iPhone used to be an AT&T-only device, and even now is available on only a few carriers. They used to not allow multitasking on phones, say, listening to music, surfing the web and dealing with text pop-ups at the same time. They still do not support Flash, blocking users out of a major form of web content. And iTunes infests every file you download with digital rights management (DRM) that makes it usable only in certain conditions.
Why does this bother me? Well, let’s think a minute about the market, economics, and information. On the one hand we’re seeing Apple prevent healthy competition. On the other hand we’re seeing them impose artificial limits on information to make it a much more easily controlled commodity, one that they can have exclusive access to. What are the economic qualities of information? Presented in three relevant sets:
- Easily searchable (harder to exclude users)
- More persistent (arguably non-rivalrous)
- Easily replicable (low marginal cost)
These traits make it easy to argue that information could be a public good (non-exclusive, non-rivalrous, with low marginal cost and a likelihood of positive externalities). Apple wants to be the one who controls all replication, search and persistence.
- Information technology (IT) benefits from network effects (the more people with Facebook or Windows the more valuable it becomes)
- It crosses many genres (e.g. we feel uncomfortable with commodifying “personal” information)
- Its true value is determined by use or knowledge
Apple wants everyone to use Apple, so that the value of their product is higher.
- It is easy to create, but hard to trust
- It is easy to spread, but hard to control
Apple doesn’t want to promote creation (unless it is on their terms, on their devices and within their constraints) and definitely wants to control the spread of information. It is trying to counter the social norms and economic possibilities of information.
Alternative: I’m not asking for a completely socialist system, or a purely open market. I’m asking for Apple to not be a jerk. Let me buy a song or app and use it on any device I own. I should be able to make an audio recording on my phone and transfer it to my computer without buying some expensive app to do it. Apple should support common standards like Flash, and allow for new competition, like running other OSes on iPhones or running iOS on non-Apple hardware. Interoperability and negotiated standards are what benefit consumers the most!
The MacBook Monster
Now, let’s talk about Mac computers.
Mac People
Are you one of us?
There are extremists out there for any kind of opinion or position. I’ve certainly met “Linux people” who wonder why you would ever use a graphical user interface (aka colors, shapes, buttons, desktop, icons, pictures, etc…). There is no single character to “Apple people,” exactly, though I think Apple’s marketing would suggest one. I’ve only played around with a little bit of content analysis of Apple ads, but the people in them are usually white and well-off, which is really no surprise or different from most IT advertising. What I’ve actually gotten more of in person from Apple users is condescension. It’s the “we’re better than you” club: we know to buy Macs because they’re better. Why? “Because they’re pretty. And not PCs. DUH, gawd, get with the times you narc-bum.” In America we get wrapped up in this tangled mess of consuming as a way to augment, project or otherwise construct our personality and identity. I can’t tell you how many times I’ve heard people say “but graphic designers use Macs because they’re better for that.” Graphic designers are using the same programs (e.g. Adobe) on Mac or PC, and actually I’m convinced a dual-display setup beats a single iMac display any day. The real reason graphic designers use Macs is because other designers do and they learn on them, not because Macs are better in any substantively measurable way for design. I also occasionally get the “I use Macs because I’m different” response. This feels a lot like when 500 sorority girls walk down the street all wearing the same T-shirt that says “Be Greek, be Unique.” Apple is incredibly normative; if you want to be truly different, run Linux.
Mac fans sometimes carry the vestiges of “thank god I switched away from Windows XP” with them. Yes, Windows XP is a bad operating system, and no, this is not news. It’s a decade old and, by comparison, now terrible. I wish Vista weren’t a disaster; I’m not going to defend it. But this doesn’t make you wiser for choosing Apple, not in 2011, when we have Windows 7 and are on the cusp of Windows 8. Ubuntu and Chromebooks have caught up as viable options for any user who isn’t producing media, playing games or doing power computing.
I don’t have an alternative for this one, really, it’s more just what I’ve observed Apple users to be like in-person, anecdotally. I just don’t want to be one of those “better than you” people, I want to bring as many people into computing and the internet as possible, and empower them along the way. I don’t want them to have to pay more money to be part of it, and I want them to be able to express themselves to the world on their own terms, not Apple’s.
Outrageous Cost Does NOT mean Quality
Not only does Apple find a way to charge you for everything because it’s all proprietary, but their computers cost a lot in the first place. Tremendously so – two to three times more expensive!
And no, this doesn’t mean their hardware is any better. I can buy a computer that’s nearly two times as fast as a Mac for what someone would pay for even an entry-level Mac. And no, Mac computers don’t have the best graphics. In fact, now that they’re stocking the new Sandy Bridge chips, their low-end models rely on Intel’s integrated graphics, which are pretty poor “onboard” chips. AMD processors, while they might be slower overall, offer better integrated graphics. Apple also didn’t get SSDs with TRIM support until Lion, placing them way behind in the storage race. Their ‘solid unibody construction’ is certainly no more durable than the equally priced Lenovo (previously IBM) business-travel ThinkPads or the military-grade Panasonic Toughbooks. Many people say Apple products are tougher and last longer, but I’m pretty convinced these days they have the same degree of planned obsolescence as everyone else: they want you to upgrade every two years and sign a new contract of some kind.
In other words Macs don’t have the fastest or highest quality hardware, and are often not at the cutting-edge of speed and technological development. They’re certainly not budget-buy machines, but they’re not the best either. They’re just priced high.
Apple Hardware Compatibility
Apple controls their hardware in many ways. They don’t want you to know the model number of your computer, so that they can keep you from upgrading it on your own; instead you must take it into their store and pay them to do it. Even under “About This Mac” you won’t find model numbers, just a vague “early/mid/late 20xx.” They also like proprietary plugs. The VGA adapter type has changed with virtually every generation of Mac, and this, combined with obscured model numbers, makes it very difficult to figure out how to buy the right one. Apple power adapters also only fit Apple computers. They are better (the magnetic thing is neato!), but it means you can’t easily borrow one from a friend unless they have a Mac. And finally, it’s harder to swap in modular parts even if you do want to do it on your own, because many hard drives and video cards require special firmware for Mac.
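For what it’s worth, you can dig the real model identifier out yourself. Here’s a minimal Python sketch of that workaround, nothing official, that just shells out to the built-in system_profiler tool on a Mac (the exact output format may vary by OS version):

```python
import subprocess

# Ask macOS's system_profiler for the hardware report and pull out the real
# model identifier (e.g. "MacBookPro8,1"), which "About This Mac" hides
# behind a vague "early/mid/late 20xx".
report = subprocess.check_output(
    ["system_profiler", "SPHardwareDataType"], text=True
)

for line in report.splitlines():
    if "Model Identifier" in line:
        print(line.strip())
```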
Alternative: Apple, get with the program. Just make your hardware flexible, labeled and modular like everyone else. It’s not good to be different if different is incompatible or unnecessarily more costly. Oh, and can you release your OS for install on any computer hardware? That would be handy. And… like… Windows and Linux??
Virus? Coming soon to a Mac near you
While historically Macs have been a safe zone from viruses, I am confident this will change in the future. They aren’t as tempting as the large, Windows-based corporate networks that can be turned into botnets for spam and DOS attacks and the like, but they do have three factors going for them that could easily make them the up-and-coming go-to target:
- Users with money
- Users who are less digitally literate who purchased a Mac because it’s “the easy computer”
- Apple is increasingly “the man”
Yes, I am insinuating that many Mac users are “rich people who are bad at computers,” and this will make them targets for viruses with clever social engineering. This will likely spread to iOS as well.
Alternative: Uh, well, anti-virus for Mac is going to become more common…
The OS – Real Problems?
Clearly people get used to certain operating systems. They get annoyed switching because there are slight differences between them. Sometimes those differences do matter in a substantive way. For instance, Exposé is really handy for managing many windows on a single screen, and is native only to OSX. By contrast, the ability to truly get a full-screen view of pictures in a folder and adjust them to any preview size you like is only available in Windows 7 and Linux. I don’t honestly think these are big issues. Apple, like Microsoft, has made mistakes in changing its interaction models in ways that defy previous norms (OSX Lion switching the scroll direction), but this isn’t all that big of a deal. The biggest complaint I have about the OS is really more of a complaint about people being disorganized, but it’s worth noting because there’s a dimension of abstraction and learning to it.
The File System Display: Apple obscures the hierarchical relation of folders and files. In Windows you have a C drive, subfolders and so on, and you could easily illustrate this structure as a tree. Mac has this too, but if you have a folder open in Finder you just see yourself in a place like “Documents” or “Pictures” or “Applications.” I find this causes users to do silly things, like downloading and unpacking an archive for a program into Applications, installing it into Applications, and leaving the install files floating there, creating a bloated mess. Beyond that, users lose track of sorting files in an organized fashion. They learn to just use Spotlight to find what they want and put stuff wherever, without a good model in their head for where anything is. Don’t get me wrong, this happens on PCs all of the time too (especially with Windows 7’s new “libraries” system, or just dumping files on the desktop), but I think OSX encourages the behavior by always hiding the true location you’re in (the path or address bar) and abandoning the file-address metaphor. I don’t think relational search and organization is bad; I just want to train users how to do both hierarchical and relational. I’ve met a lot of people (remember, five years as a net tech and several years as a technology education person; I’ve met hundreds of computers and their users) who can’t keep their files straight. They’ll have 10 GB videos lying around from two years ago in some strange directory that they’ve forgotten about. A gazillion programs installed that they don’t use. It’s sad and frustrating, and I think Windows does an ever-so-slightly better job of helping people have an idea of how a computer is organized beneath the GUI.
Alternative: Include a file path at the top of Finder windows, one that can be turned off via view options. Include the true path and extension of a file in the file’s properties. Really, though, the better solution is to teach people to be more capable of effectively organizing information, something I’d argue is part of digital literacy education.
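On that note, here’s a toy Python sketch of the kind of exercise I have in mind for that education: walk your home folder, print the full, explicit paths, and flag the big files you’ve forgotten about. The one-gigabyte and one-year thresholds are arbitrary examples of mine, not anything built into either OS.

```python
import os
import time

# Walk the folder tree and flag large files that haven't been touched in a
# long time -- the "10 GB video from two years ago in some strange directory"
# problem. Thresholds below are arbitrary examples.
ROOT = os.path.expanduser("~")       # where to start looking
MIN_SIZE = 1024 ** 3                 # 1 GB
MAX_AGE = 365 * 24 * 3600            # one year, in seconds

now = time.time()
for dirpath, dirnames, filenames in os.walk(ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)    # the full, explicit path
        try:
            st = os.stat(path)
        except OSError:
            continue                          # skip files we can't read
        if st.st_size >= MIN_SIZE and (now - st.st_mtime) >= MAX_AGE:
            print(f"{st.st_size / 1024 ** 3:5.1f} GB  {path}")
```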
Does Apple do anything well besides make money and control people?
Turns out yes.
iMovie.
The only low-cost, easy-to-use video editing program that lets you map music to media on a timeline, includes a hefty set of excellent templates and effects, and helps you manage your assets with other integrated sound and photo editing programs. Check it out if you haven’t already. I even figured out how to do OSx86 and run OSX 10.6.8 in a VirtualBox install to try to get it… but sadly you need a true Mac with video acceleration to run it.