Information on virtually any topic is freely available today, and much of it can easily be misconstrued and misused. The next major step in the digital age is deciding how to use that information responsibly.
Thomas Friedman made a point of declaring, in the title of his recent bestseller The World Is Flat, that everyone and everything is becoming much closer in a globalized world. This has two effects. First, we are seemingly closer to everyone; few, if any, people in the world are more than a click or a call away. Second, and ironically, we are also that much further from those very same people.
With the growth of the technology needed to flatten the world has come a growth in individualism, turning many people into more reclusive beings than they might otherwise be if they could not do everything from the comfort of their own keyboards. Everything in a Web 2.0 world is intertwined, making it possible to look up information on any topic in existence, or to add to it, without leaving home.
Libraries have long been fading out of popular academic life because of how useful the Internet can be. Academics, in turn, claim that the same forces thrusting the world into an increasingly digital age are cheapening information and handing the academic's role over to webmasters. Yet look at the results of this flattening landscape and a few things have remained consistent: very few people, if any, are qualified to make academic or professional decisions about their own lives, and even those few who are qualified make mistakes.
The WebMD Effect
Obvious as the example may seem, self-diagnosis over the Internet shows how free access to information has harmed some fields more than it has helped. While WebMD runs commercials about the thoroughness of its website and the ability of its visitors to diagnose themselves in seconds without paying for a doctor's visit, thousands of those visitors are logged in right now, convincing themselves they have every symptom of a rare blood disease that must be treated in the next 24 hours, lest they die an agonizing death.
Yes, WebMD does offer an invaluable resource to millions of visitors every year who merely need to know if their cold is contagious or what kind of food they can have while taking antibiotics. But, when it comes to the prime reason most people visit the site―a random spot on their skin or a sharp pain in their side―the site’s wealth of knowledge can do more harm than good.
There is a reason that when you post on a medical forum or email a doctor about a particular health problem, the reply is almost always vague advice followed by an immediate recommendation to have the problem looked at in person: professionals don't want to alarm you, and they haven't actually seen the problem. WebMD does the opposite, albeit by accident.
The Wiki Craze Gone Awry
I won’t go on about how Wikipedia and the free distribution of information dilute the supply of well-researched, accurate information. That case has been made many times before, by writers far better versed in how the technology works. However, in my own studies of late, I’ve noticed an increasing number of books citing Wikipedia as a credible source.
This in itself is flawed logic. Wikipedia, as a source, is often correct, kept that way by the millions of users who routinely edit and organize the information there. However, it takes no time at all to open a page and change it to say whatever you want it to say; that is the entire purpose of a wiki: it is written by us. To cite such a volatile source, where the facts depend on who last wrote them, is a dangerous path to take.
I admit to using Wikipedia regularly to back up my own points, but I also try to check anything I find there against a second source before actually claiming it is true.
The Dangers of Blind Faith in Information
As people become increasingly plugged into free information and self-diagnosis, whether medical or academic, the risk of complacency rises substantially. There is nothing wrong with directories full of information that anyone can check at any time; the concept is ingenious, and a great development in human culture. However, when those resources are not verifiable and offer no professional assurance, we need to retain a healthy skepticism about everything we read.
Accepting that the massive brown spot on your arm is just a mole, without checking with a doctor, is dangerous. Writing in a research paper that the 1913 World’s Fair was held in Hoboken, New Jersey, is irresponsible (no World’s Fair has ever been held in Hoboken).
For years, a major part of formal education was teaching children how to find and use information responsibly, through libraries, interviews, and academic sources. That same approach continues today, but it needs to change: children now also need to be taught how to filter the information they find.
A quick Google search will turn up nearly every fact and falsehood imaginable on a given topic. Do we want children researching World War II to land on a website denying the Holocaust ever happened? The next major step in using and understanding this new age of informational freedom is learning, and teaching, how to read that information responsibly.