Thursday, December 25, 2014

The Dying Hegemonies of Access

Here is an interesting pattern which we can use both to make sense of the past and to predict what might be happening in the future. I call it the Dying Hegemonies of Access. For those of you who are not familiar with the word "hegemony," it is defined by Merriam-Webster as "the social, cultural, ideological, or economic influence exerted by a dominant group." It used to refer to the ruling families of Europe at the end of the 19th century. But, I am re-purposing the definition to describe a more modern phenomenon. I will begin, in this post, with a simple example. In future posts, I will elaborate on other examples. And I will use this pattern to make predictions beyond the more obvious ones of today.

The example I want to begin with is one that is very familiar to everyone - the local bookstore vs. the online bookstore. A couple decades ago, if you wanted to buy a book, you would have to go to your local bookstore. Granted there were some exceptions. There were some used book stores. And some books were available through catalogs. But, the majority of books bought by individuals were acquired through local bookstores. 

We can think of these local bookstores, along with their distributors, as a hegemony of access. That is, they were a social institution that controlled access to reading material. I don't mean this as a criticism. In fact, if it weren't for local bookstores, back then, most people would have had almost no access to reading materials.

But, over time this hegemony of access, like the ruling families of Europe, began to crumble. First, there were discount book sellers who provided less costly access to high volume bestsellers. Then there were large bookstores that provided much greater variety. I can remember when the first Borders bookstore came to town. Suddenly, I had access to a variety of books that, previously, I would have had to live downtown in a major city to find. Then, of course, the web showed up and Amazon.com provided everyone with access to everything. Eventually, the hegemony of access that was the local bookstore crumbled.

This is a pattern we have seen over and over again. I will lay out examples in future posts. It is also a pattern we can use to anticipate what might happen in the future. I will provide some examples there too. I would also point out that not every local bookseller is gone. Barnes & Noble is still around and still very solid. They adapted to the shift in access and there is a lesson to be learned there as well.

Friday, December 19, 2014

Change is in the air

I have been writing a series of posts in one of my other blogs (Ranting and Reflecting) about how we are approaching a period of change. While these posts focus on a shift from "Normal" times to "Chaotic" times, I also explain this as a recognizable pattern. I then go on a bit about how to recognize and validate patterns. If you are interested in patterns, you may want to have a look.

Saturday, April 26, 2014

What Is So Wrong With The Way We Do Things Today?

The current state of white collar work is not sustainable. We waste time in traffic driving to work and create pollution problems with our cars. You hear a lot of people talk about alternative energies due to the pollution caused by oil in general and cars in particular.  But, you don't hear anyone suggesting that we just drive a lot less by giving up this antiquated notion of driving to work so you can be where the action is. Certainly working in virtual worlds provides an alternative to driving to work and that alone might be enough incentive to change the way we do things. But there are more reasons.

Today most people in white collar work do not know what they actually do. Through some sort of indoctrination they learned the rituals of their employing entity. But, how are those rituals connected to the value created by the entity and how are the efforts of individuals connected to that aggregate value? Nobody knows. It will be hard to explain to our grand kids that back in the beginning of the 21st century we got paid for showing up rather than for productivity and that showing up had serious costs in terms of both productivity losses and pollution problems.

We might also try to explain how we did not always have the best person doing each job and people did not enjoy their jobs nearly as much as they should. Or maybe we will just be so embarrassed by how things are done today that we might decide not to even bring it up.

I am going to take a break for a while from writing this blog. I have a seriously difficult philosophical paper to write and I need to concentrate on it. I will return when I am rested, recovered, and full of exciting new ideas once again.





Friday, April 18, 2014

Adam Smith and the Pin Factory

One final thought before I bring this all together. In 1776 Adam Smith published his landmark book entitled The Wealth of Nations. It was a landmark book because it introduced the field of economics. In it, Smith gave an example of good economic behavior that is now widely known as the pin factory example.

In the pin factory example, Smith argues that economic prosperity comes from specialization. In the pin factory, he explains, there should be one person for each task. One to cut the wire. One to sharpen it. One to put on the pin heads. And one to stick the pins in the paper rolls in which they were sold.

This idea of specialization caught on in many professions over the next two centuries, increasing their effectiveness and economic productivity. For example, you don't go to a divorce lawyer to do your taxes, and you don't go to a cardiologist when you have a stabbing pain in your stomach. There are endless examples in other areas, such as engineering, where we have civil, mechanical, and many other specialties, and academia, where faculty are often defined by their departments regardless of frequent calls for interdisciplinary research. The effect of specialization even reaches into creative professions such as creating comics or films. For example, it is rare that someone writes and draws their own comics. And even among artists there are many specialties.

But, what about white collar work? What if you work for a government agency or a large corporation? Chances are that being a generalist is considered preferable to being a specialist. And that is one of the many reasons why things don't work very well in government and large corporations. There are exceptions of course. But, it is fair to say that the average white collar worker does not really know what they do and, if asked what they do, will resort to an answer that indicates who they work for rather than what they actually do. And this is a problem.

Friday, April 4, 2014

A Penny for a Spool of Thread, a Penny for a Needle...

We often believe that the way we see the world today is pretty much how people have always seen the world, and that if people in the past didn't see things the way we see them today, they would have if they had really thought about it. This is, of course, untrue. People in the past have believed a wide variety of different things. And it is useful to peek into one of these past moments to put our current beliefs and worldview into perspective.

As the industrial age went into full swing in Victorian England, a few interesting things occurred which shed light on our current worldview and future economy. The first relates to the previous post about the clock and capitalism. As farmers moved into cities and began working in factories, one of the problems that factory bosses had was getting the workers to show up on time. Back in those days people were used to getting up with the sun (or rooster), which occurred at a slightly different time each day. Having them show up for work at the same time every morning felt silly and unnatural. If you told them that in the future people's activities would be controlled by digital timekeepers they probably wouldn't believe it. And many things that I tell you about the future are just as hard for you to believe.

A second problem that occurred in the factories was that people smelled very bad. Back in those days people would not bathe on a daily basis and might go for weeks or months without a bath. Bathing was considered unhealthy. And, in the confined spaces of a factory, it could get pretty ripe. Bathing did not become a regular practice until the germ theory of illness took hold, which said that common illnesses such as colds were caused by germs and that bathing could reduce the germs. Today, we fully accept germ theory and the hygiene that goes along with it. But, in Victorian England it was a tough sell.

It is interesting to speculate about what things we believe today that are untrue, the truth of which would lead us to behave very differently. In Woody Allen's movie Sleeper, a man is put into suspended animation and wakes up in a future where cigarettes and bacon are good for you. I don't care about the cigarettes. But I long for a world where bacon is a health food and the more scotch you drink the longer you live. Ah, if it were only so.

The final point has to do with productivity. Victorian coal miners got paid for the amount of coal they brought out of the mines and did not get paid if they were not productive. Since life was good when a vein was found and not so good when they were trying to locate one, the miners and their families got used to cycles of abundance and deprivation. These cycles are recorded in a well known nursery rhyme - Pop Goes the Weasel. Pop was the slang word for pawn and the weasel was slang for one's Sunday best clothing. When times got tough and money ran short, families would have to pawn their best clothing in order to get by until times got better. We can see this in the words of this catchy nursery rhyme:

A penny for a spool of thread,
A penny for a needle,
That's the way the money goes
Pop goes the Weasel.

If you were to tell Victorian miners that in the future people would get paid just for showing up, they simply wouldn't believe it. If you tell people today that in the future people will be paid based on productivity, they don't find that any easier to believe.

Friday, March 28, 2014

Did the Clock Cause Capitalism?

There is an argument put forth by philosophers of technology that asks the improbable question - did the clock cause capitalism? The motivation, I believe, is to show how seemingly benign technologies can have profound and far reaching impacts on our lives. Yet, as unlikely as it seems that there is any connection at all between advances in timekeeping technology and the rise of a new economic model, the argument that they offer is difficult to dismiss. I will sketch it out here and leave it to those more diligent than I to look up references. You can find the basics of this argument, if you are so inclined, in the works of Lewis Mumford.

In the Middle Ages timekeeping devices such as the tower clocks in monasteries were improved as the residents of monasteries wished to keep a regular schedule and perform their devotions at the right time. The villages around the monasteries could hear the bells tolling as the hours passed, giving the villagers a growing sense that time was, indeed, a real thing and that it could be accurately measured.

The measurement of time led to another new idea: that one could not only measure time but measure the amount of work done in a unit of time. In earlier days, when time could not be accurately measured, this idea would have been preposterous. Today we would call this notion of work done per unit of time productivity. And it didn't take long before people began to think that producers should be rewarded for productivity. The most productive should be rewarded and move to the top. The less productive should receive smaller rewards and move to the bottom. It is a small step from this fundamental notion to the tenets of capitalism.

A secondary effect of these more precise timepieces was an advance in instrumental realism. The clocks were made to measure a phenomenon, that is - time. But after a while the clocks themselves became more real than the phenomenon they were measuring. If you ask anyone today what a day is, they will say 12 hours. But, the days nature creates vary greatly and can best be described as the time between sunup and sundown. Today we view those widely varying and imperfect days created by nature as secondary, whereas the perfect days created by our timekeeping devices are the real ones.

While on this thread of unexpected effects of technology, one could argue that the telephone caused skyscrapers. The gist of this argument is that considering the number of messages going in and out of offices in buildings, it would not be practical to have buildings more than a dozen or so stories high if messages had to be carried in and out by couriers. But, allowing electronic transmission of messages removes that bottleneck and allows for the tall buildings that we have today.

One has to ask, did the clock cause capitalism or did it allow capitalism? Similarly, one could ask, did the telephone cause skyscrapers or did it allow them? I would lean toward allowing rather than causing. But, semantics aside, the point is that seemingly benign and unrelated technologies can have a major impact on our lives and our worldview.

Friday, March 21, 2014

Playing to Your Strengths

Marcus Buckingham (with some co-authors) has made some impressive contributions to positive psychology with a series of books - First, Break All the Rules; Now, Discover Your Strengths; and Go Put Your Strengths to Work - as well as other books and other media. I refer to this as the StrengthsFinder movement as a convenient shorthand because StrengthsFinder is the name of a book, a test, and a website derived from the research. I would encourage people to read these books as my very brief synopsis will not do justice to the richness or importance of the ideas coming out of this movement.

Nonetheless, Buckingham asserts that the dominant model of performance review used in most organizations today is flawed. Typically, an employee will talk to his or her manager, who will identify the employee's weaknesses and encourage said employee to work on those weaknesses over the course of the next year. This, according to Buckingham, will, at best, lead to adequate organizations but not high performance organizations. Each person has strengths which need to be identified, developed, and used to the advantage of both the employee and the organization. And, rather than work on one's weaknesses, one should work on one's strengths.

There is more justification for this than I have room to go into here. But, a simple example can provide a plausibility argument. On a typical football team, the quarterback excels at passing, runs adequately, and is likely to be rather poor at blocking or tackling. If the football team were run like a typical organization, the quarterback would be called into his annual review and the weaknesses in blocking would be pointed out. The quarterback would then spend the next year trying to raise the quality of his blocking. This is silly on the face of it. But, it is how modern organizations run. In order for an organization to be high performance, individuals must figure out what they are good at (we call these strengths) and find ways to employ their strengths in the pursuit of organizational goals.

I have taken the test and my first strength is Futuristic. I am good at seeing the future. If you have been reading this blog for a while, you know that it is all about predicting the future. There are, as one might expect, also things that I am not good at. For example, I am not particularly good at schmoozing other people and making them feel comfortable. Nor do I particularly care. One of my other blogs DrArtz-RantingAndReflecting provides ample evidence of these weaknesses.  But, when I am thinking about the future it is effortless and sublime.

Unfortunately, we tend to hide our strengths to avoid the criticisms of others. This is a carryover of industrial age thinking where conformity is valued more than uniqueness. If I am good at predicting the future and others are not, then there must be something wrong with them or wrong with me. Usually, people see this as something wrong with me. They get tired and annoyed when I talk about the future. But this isn't about me. It is about our inability to accept strengths and uniqueness in others. We don't like people who can effortlessly talk to strangers at parties. We don't like people who can remember names. People who strive to keep things on an even keel are just annoying. And those who care deeply about others, well, there just aren't words to describe the contempt we feel for them. So, when people are good at things, others, who are not good at those things, like to downplay their importance.

But, we can't avoid the fact that when people are using their strengths they are more productive and more satisfied. So, while this reorientation that Buckingham is talking about may take some time, it will definitely happen. And, all of the pieces I have been bringing together about the nature of work will help that happen.

Friday, March 14, 2014

Wikinomics, Free Agency and the End of Tribal Corporations

A few years ago I read a very thought provoking book called Wikinomics: How Mass Collaboration Changes Everything by Don Tapscott and Anthony D. Williams. I highly recommend it. It makes some very compelling arguments for a new economic model in which people who work for corporations are slowly being replaced by free agents who compete for the work over the web. There is more to it than that, but it sparked my thinking so I wanted to give it credit for the spark. About a year or so later I read another book, Free Agent Nation: The Future of Working for Yourself by Daniel Pink, which further solidified my thinking in this area by revealing the extent to which free agents already participate in the job market. After extensive contemplation, I came to some conclusions about how this might play out. The following is my vision of the future of work. I am moving cautiously here because I want to give credit where credit is due while not attributing any of my crazy speculations to the sources of inspiration.

Although there are some very significant counter examples in the world today, most citizens of modern, developed, autonomous nations have long ago forgotten their tribal origins. I would draw an analogy between tribal allegiances and corporate allegiances. That is to say that there was a time when tribal allegiances were a necessity for the workings of society and the protection of individuals. Similarly, there was a time when corporate allegiances were necessary for the workings of the economy and the protection of individuals. Except for those counter examples, tribal allegiances have all but faded into history. And it won't be long before corporate allegiances fade as well. Imagine someone today asking you what tribe you belong to. For most people this question would sound silly. In the future the similar question - what company do you work for - will sound equally silly. When somebody asks who you work for, you will answer "I work for myself, just like everybody else does."

Why do we work for corporations or government agencies today? I think there are two primary reasons. First, working for an entity gives you a place to go to meet with your co-workers. And, second, inherent in the management of the entity is a control structure which directs work and manages productivity. But, neither of these features is part of the fabric of reality and both can change as our reality changes.

The first is easy. Telecommunications has made co-location irrelevant, as I argued in the Virtualization of YOU. The second takes a little more explanation. Think about tribal structures, which had control structures as well. There was probably a chief, some warriors, maybe a shaman, and the like. Those jobs were replaced by governments (local and federal) and professions (doctors, soldiers, teachers, etc.). It is possible, maybe even likely, that corporate roles will follow the same path. In other words, managers and executives will be replaced by similar roles that are not limited to a given entity, possibly an industry or even a nation instead.

"What will happen to the workers?", you might ask. My guess is that they will all become free agents, bidding for pieces of work that are tendered on the Internet. So, when work needs to be done, industry executives will dole it out online, individuals will bid on it, complete their tasks and get paid for the work they did by the industry entity. Presumably, if one believes in an efficient economy, workers will bid on the work that they do the best as to get maximum return for their efforts. From the industry side, worker will be chosen based on their efficiency at completing tasks as desired. One side effect of this is a maximally efficient economy as everyone does exactly what they do best and each job goes to the best worker. There is more to it than that as this must occur along with other trends. But, I don't want to get too far ahead of myself.But, if the analogy holds, tribal members eventually became free citizens and corporate workers will become free agents.






Friday, March 7, 2014

Virtual Worlds and the Virtualization of YOU

I would like to begin with an analogy as analogies are one of the patterns one uses to get some purchase on the future. And I am going to use this analogy to make two key points.

In the early days of the World Wide Web, the economics of the web made it not only a viable alternative to business as usual but a serious threat to business as usual. There are many, many examples. Companies that sent out brochures to customers would spend several dollars per customer. Allowing customers to access the same information on a web site dropped that marginal cost to pennies. If you sent out catalogs, the savings were even more dramatic. Software distributors could distribute software more cheaply. Customer service became much easier as customers could access the website for information about products. If products required some training, videos could be posted. The economics of the web made it silly to do things the old way.

The second point in this analogy is that when we step back and look at the web from a more abstract level we can see that the web is a large interactive document that made the location of information irrelevant. It used to be that you had to know where to find stuff. And locating information often meant a trip to the library, courthouse, agency or business where the information was located.

Applying the first part of this analogy to virtual worlds, we can see enormous economic benefits of using virtual worlds. Several years ago I attended a professional conference in Second Life. I got up that morning, made a cup of coffee, logged in to Second Life and attended the conference. Had I attended the conference for real, I would have lost a day traveling to the conference and a day traveling back. Further, if there were dead spots in the conference during which no papers were being presented that I cared about, my time was dead as well. At home, I could just go about my regular business if there were lulls. In addition to my time, there were other expenses. I would have had to pay for airport parking, airfare, ground transportation to and from the hotel, room rental fees, and meals. This could easily add up to thousands of dollars whereas the virtual conference cost me next to nothing. This is one example where the economies of virtual worlds parallel the economies of web technologies.

But, this example is far from the only one. One can use virtual worlds for meetings, tourism, and education with similar economies. I will elaborate on the tourism example. Let's say I wanted to spend a week vacationing in Ireland. Using the old approach, I would fly to Ireland, spend two or three days learning how to get around, two or three days enjoying the country, then a day or two winding down. Once I arrived back home I might realize where I would have liked to have gone but didn't. And I might remember something I decided not to buy but now wish that I had.

Let's look at the same vacation with the benefit of virtual tourism. First, I might visit virtual Ireland several times during the weeks or months leading up to the trip. I am doing this at my leisure and from the comfort of my home. But, during that time I become more acquainted with the country and what I want to do there. Then I take the real trip. Since I have already familiarized myself with the country before arriving, I don't have a period of adjustment and I don't make the wrong decisions about what to do. Then when I return home and realize I should have bought those Irish linen sheets, I can just go back into virtual Ireland, find the store, and order the sheets. Further, there are other places that I might want to see virtually but not actually travel to. So, the virtual world offers touring options that don't even exist today.

On the second point, that of virtualization, I should mention that the web was merely a step in a series of virtualizations where telecommunications technology made the location of things irrelevant. Early steps included such things as coaxial cables, which made the location of the computer within a building irrelevant, and on through local area and then wide area networks that made the geographic location of a computer irrelevant. The web and the irrelevance of document location was an important step. And now, with cloud computing, we have made the location of most things irrelevant. Most things, that is, except you. But that is about to change. Virtual worlds will make your location irrelevant. By logging into a virtual world you can be present virtually anywhere. So, not only are computing resources and information virtualized, but you are now virtualized as well.

Friday, February 28, 2014

Potential Impacts of Data Warehousing

In 1776 Adam Smith published his classic book The Wealth of Nations which introduced a new way of thinking that would eventually become economics. In this book, he introduced, among many other things, the idea of the specialization of labor. Specialization of labor allows one to focus on one task and perfect it. 

In 1911 Frederick Taylor published a landmark book entitled The Principles of Scientific Management. In it he laid out the principles that ultimately transformed craftsmanship into manufacturing. Over the course of the 20th century, manufacturing became more refined. Taylor added to Smith's idea of specialization of labor the ideas of idealizing task structures and measuring outcomes. It ultimately led to a revolution in manufacturing.

This transformation in manufacturing led to most of the products that we take for granted today. For example, if we were still in the craftsman days, most products including cars, computers, dvd players, wireless internet connections, blenders, electric can openers, etc. would be way too expensive for the average consumer, if they could be made at all.

That revolution never came to white collar work. There is very little specialization in white collar work. I should mention for the sake of clarity that white collar work is different from professional work such as law or medicine where specialization is more common. White collar work is what happens in the offices that make up private corporations and government agencies. Not only do people who do white collar work rarely specialize, the pressures are often in the opposite direction. You must be flexible, a team player, or one who can pitch in and do what is needed.

Most people who do white collar work are unclear about what they really do. If you ask a white collar worker what they do they will probably mention the company or agency that they work for or attempt to explain what they do by stating their title.

Enter data warehousing and the world begins to change. To put this into perspective, consider the impact of the relational data model over the past few decades. Relational databases are inherently categorical and their implementation has led us to think of the world in terms of categories. Data warehousing, on the other hand, is process oriented; measurable processes to be specific. We model business processes in a data warehouse with the intent of improving them. We routinely think of what categories we fall into - employee, customer, voter, etc. Soon we will think of our work as consisting of measurable processes. This, in effect, applies Taylor's ideas to white collar work.
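To make the contrast concrete, here is a minimal sketch in Python. The entities, column names, and numbers are invented for illustration (they are not from any particular warehouse product or from the dimensional modeling literature); the point is only to show a categorical record next to a process-oriented fact table whose measures roll up into a productivity figure.

```python
# Minimal, hypothetical sketch: categorical view vs. process (fact table) view.
from collections import defaultdict

# Relational/categorical view: a worker is described by the categories they fall into.
employee = {"employee_id": 17, "department": "Claims", "title": "Analyst"}

# Dimensional/process view: one row per measurable unit of a business process
# (here, a day of claim reviewing), carrying the measures of that process.
claim_review_facts = [
    {"employee_id": 17, "date": "2014-02-24", "claims_reviewed": 6, "hours_spent": 5.0},
    {"employee_id": 17, "date": "2014-02-25", "claims_reviewed": 9, "hours_spent": 6.0},
    {"employee_id": 42, "date": "2014-02-24", "claims_reviewed": 4, "hours_spent": 7.5},
]

# Because the process is stored as measures, productivity is a simple aggregation.
totals = defaultdict(lambda: {"claims": 0, "hours": 0.0})
for row in claim_review_facts:
    totals[row["employee_id"]]["claims"] += row["claims_reviewed"]
    totals[row["employee_id"]]["hours"] += row["hours_spent"]

for emp_id, t in totals.items():
    print(f"employee {emp_id}: {t['claims'] / t['hours']:.2f} claims reviewed per hour")
```

The categorical record tells you who someone is; the fact table tells you what they measurably did, which is exactly the shift I am describing.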

If this catches on, it will transform the way business operates. Everybody will know exactly what to do in order to be productive. And, if this all works out, people will actually be paid for being productive. Too often in today's world people get paid for showing up or gaming the system in various ways. There are many implications of this which I will save for another day so I can get to the point I am trying to make.

An epiphenomenon is that people will be able to work remotely, as productivity will be the only factor. So, whether you show up at the office or not will no longer matter as long as the work gets done. And since you only get paid for the work done, nobody will really care how or where you do it.

I said 'if' earlier because there are competing views of the data warehouse. The competing view, and the more traditional view, is that the data warehouse is merely a large historical store of data. Which way will things go? It is hard to tell. But, in this question is a teaching point about predicting the future. The future is anything but determined. But, we can establish contingencies and say if A occurs then here are the likely consequences. If B occurs something else will likely happen.

Friday, February 21, 2014

The Changing Nature of Work

One of my more outrageous claims about the future is the changing nature of work. In this post I will lay out some groundwork for the claims. In the next I will explore implications. And in the following one I will return to the current theme and look at the nature of work in the rear view mirror.

Let's begin with another backward looking technique I introduced earlier. Imagine it is fifty years into the future. Your grand kids are sitting on your lap asking what life was like back in the early part of the 21st century.

"Well, one thing that was very different," you begin, "is that everyone got up in the morning and drove an hour or so to get to work. Most of the drive was in dense traffic which is why it took so long."

Your grand kids squirm and giggle and ask "why did everyone do that?"

"Well," you reply, "you had to be in the same building as everyone else in order to do your work."

"That's silly," they reply. "Why don't you just talk to them with your computer like everyone else?"

You start to reply but you don't have time. They slide off your lap giggling, thinking that you are teasing them. They go on to the next thing that grabs their attention as you ponder how very different things are now.

Why are they so different? There are several forces that have come together to create a massive change. As I have mentioned before, one way to detect likely changes is when there is a convergence of forces. In this case I would single out, but not limit it to, five factors: 1) data warehousing; 2) virtual worlds; 3) wikinomics; 4) positive psychology; and 5) all of the usual problems associated with cars, traffic, fossil fuels and air pollution.

First, let's take on data warehousing. There are competing views on data warehousing but I am a strong advocate of the view presented by Ralph Kimball. In this view, the data warehouse models measurable business processes. To cut to the chase here, modeling measurable business processes allows us to improve them and forces a discipline on white collar work that was previously only achievable in manufacturing. Certainly one of the implications of this is that people in white collar jobs might spend more time working and less time on other silly business around the office such as meetings and sucking up to the boss. Another implication might be that people actually get paid for the work they do rather than paying everyone around the same amount regardless of the disparities in productivity.

Second, virtual worlds will make location irrelevant. In the same way that web technologies made the location of documents irrelevant, virtual worlds will make your location irrelevant. So, instead of hopping in a car and going somewhere, you can just log into a virtual world and meet with others regardless of where they are located.

Third, wikinomics, or the study of mass collaboration, suggests that in the future we will all become independent contractors working in collaboration with each other in dynamic teams rather than having to work for a company and show up at a building to work with others.

Fourth, positive psychology suggests that work is most productive and most satisfying when you are working at what you do best. Wikinomics will allow this, and economic forces will drive it.

Finally, our current approach with way too many cars, way too much driving, way too much wasted time, and way too much pollution is not sustainable. We have to do something. So these changes will come about through a combination of need and possibility.

How is this all going to work? Well, I admit that what I have provided is a bit sketchy. So, stay tuned as I explain it in more detail.



Friday, February 14, 2014

Looking at Education in the Rear View Mirror

Following on the thread that I brought up in the last note, we can apply this idea of looking at technological change in the rear view mirror to education. Currently, there is a lot of interest in distance education, which, I believe, will do to the education industry what the web did to the newspaper industry. Yet, as always, there are those who refuse to believe that this will happen. And they have their talking points, an earmark of futile resistance that I have discussed before. Most of those talking points center around the virtues of face to face interaction. If you have been reading this blog for a while, you might see some parallels between the arguments for face to face interaction as arguments against virtual worlds and the arguments for face to face interaction as arguments against distance education. They are basically the same and can be summarized as "what I am used to is better".

Nonetheless, I should give them a fair airing. Distance education skeptics will point to the fact that in the classroom you can see the students' faces and they can see your face. Since you can see their faces, you can tell if they are confused by what you have said and try explaining it a different way until the light goes on. Since they can see your face, it gives you another channel of communication by which you can emphasize points or provide nuance. One could also argue that being in a classroom provides a sense of group cohesion and makes learning a social experience as well as an intellectual experience.

I actually agree with all these points and utilize them routinely in my face to face classes. But, I would also raise three questions. First, how uniformly and effectively are they applied? Second, are there other ways in which the same thing could be achieved? And, third, does distance education provide any benefits that the current approach does not?

On the first point of uniformity and effectiveness, I would point out that teachers are not uniform in their ability to read their audience. Some miss audience feedback entirely. Some misread it. Some are simply not interested in it. While there are, admittedly, some teachers who really connect with the audience, this is far from universal. So, holding it up as an essential feature of today's education system is a bit like attributing the surge in the number of baby boomers to drive-in movies. It certainly is a factor, but probably less common than one might think.

On the second point, we assume that audience feedback can only be achieved in face to face interaction. Not only is this not true, it could be argued that real time feedback is inhibited by face to face interaction. Some students are shy. Others do not want to appear stupid by asking a question. Some do not want to appear interested, as dispassionate detachment is often seen among college students as a virtue. Technologically mediated interaction can even be preferable, as we have seen in Group Decision Support Systems.

On the third point, I could go on at length, but will just summarize a few points to spare the reader. First, in distance mode students can view their lessons at convenient times and review them as needed. There is less variation in quality between classes as the classes can be refined as needed. In today's environment there is a tremendous variation between teachers and even variation from one class to another with the same teacher. Finally, distance education allows us to adjust to learning styles and to utilize multimedia and advanced technologies more effectively. So, face to face education is far from the only approach and, arguably, not even the best.

Now, let's look in the rear view mirror. Imagine a scenario a hundred years into the future. All education is done online. Applying the manufacturing model of constant refinement, the classes are nearly perfect after a century of accumulated incremental improvement. Students are effectively educated in all areas where education is needed, from basic job skills, to life skills, to the intellectual skills needed for good citizenship. And, then... something happens. The Internet goes dark. The reason for the Internet going dark is not important. But, if you are one of those people who cannot get past certain details, let's say it was a solar flare or a comet passing too close to the earth creating electrical disturbances.

With the Internet dark for the foreseeable future we have to figure out a way to continue educating the population. Somebody suggests going back to the way it used to be done. We can put them in a room, thirty people at a time, at an assigned time, and have somebody stand in front of the room and tell them what they need to know. How well do you think that would work? My guess is - not very well. We do things the way we do them today, not because it is the best way to do them, but because we are making the best of a bad situation. Trying to make that bad situation noble is nothing more than a lame attempt to hang on to what you are used to.

Saturday, February 8, 2014

Looking at Changes in the Rear View Mirror

For reasons that I do not fully understand, people resist the notion that things will change in the future. This is odd because if you were to ask someone if they thought the world would be static for the remainder of their life, they would dismiss the question as being silly. If you ask what they think will change, they will probably offer ideas that are not really changes, but rather extensions of what is already going on. I think there is some interesting psychological research in this idea. Why do people resist the idea of change? What kinds of changes are acceptable? And so on. But, the problem is - how do you get people to allow for the possibility of change when those changes are not ones that they would readily accept?

A technique that I use, which seems to be somewhat successful, is what I call looking at change in the rear view mirror. Instead of having a person evaluate the likelihood of changes in the future, have them imagine being in the future looking back. I have introduced this technique before when I told about having students imagine they were grandparents talking to their grand kids about changes that occurred in their past in the scenario, but in their futures for real.

Here is a simple example. There has been some talk on the television about cars that park themselves and brake themselves if a car in front has braked. These are both fairly safe examples so there is not much resistance. But, if you roll things ahead a bit and talk about self driving cars, where the person merely goes for a ride and has no control over the car, people start getting nervous. I think it is the lack of control and the belief that these cars will not really work, so the person still needs to be able to override the car's autopilot. This is actually silly when you think about it. It would be like having an override switch on your anti-lock brakes that would allow you to eliminate their interference when you slam on the brakes.

To put this into perspective, imagine a future as far out as you need to in order to accept the reality of self driving cars. Imagine that everyone uses self driving cars and that they have been around for decades, virtually eliminating traffic accidents while conserving fuel and allowing the riders to do other productive tasks. Then imagine some catastrophe that makes self driving no longer viable. Perhaps the Internet is destroyed, or sunspots knock out GPS, or some sort of virus has destroyed the driving chips. Whatever the reason, cars can no longer drive themselves and people have to switch to manual driving mode in order to get around. What kind of chaos would that bring?

We tend to see the ways things are as the best way that things could be rather than seeing the way things are as merely making the best of a bad situation. When we look at it in the rear view mirror, things look a little different.

Saturday, February 1, 2014

Vegans of the Mind

In the period of time just after World War II, food was plentiful and nobody would think twice about having eggs and sausage for breakfast, a cheeseburger with fries for lunch, and a steak with a baked potato stuffed with butter and sour cream for dinner. People did not think very much about the relationship between food and health. After all, there were a lot of starving people in the world. So, anyone who had plenty of food must be well off, and, as a derivative, healthy. This model of excessive consumption does not even address the vast number of cigarettes that were smoked, at the time, nor the rivers of alcohol that were consumed. And nobody thought of this as anything but healthy. To be fair, I suppose I should say that nobody thought of it as unhealthy, which is a little different.

But, as we moved into the 1960's we began to see some evidence that what you put into your body in the way of food did, indeed, have an impact on your health. In the decades since then we have turned that around as we become ever more compulsive about eating healthy. You could say that we have gone a little overboard with our obsessions with second hand smoke, sugar free drinks, starvation diets, and all manner of questionable supplements. I think in the future, when all this sorts out, we will look at the food obsessions we have today like obsessions in the past, where women stuffed themselves into unbearably tight corsets in order to have the proper figure and men dripped with sweat wearing hot suits to look professional and proper. But, I will leave that there as my point for this piece is something entirely different.

We are on a new exploration of good health that is analogous to the previous one, only we are now beginning to understand that well being and mental health are, indeed, related to the things you put into your mind. Many of our beliefs and values are the cognitive equivalent of junk food. Salt and fat taste good so we eat them. Ridiculous ideas feel good so we adopt them. I am not going to mention any specific ideas that constitute ridiculous thinking as that would invite readers to defend ridiculous beliefs, missing the point of this post. But, I will say that there have been amazing advances in the past couple decades in cognitive science, positive psychology, and the functioning of the brain which address happiness, well being, optimal living, will power, self delusion and a host of other related topics. As these studies advance our understanding, ridiculous, dysfunctional and counter productive ideas will begin to recede.

Friday, January 24, 2014

Predictive Shipping

Several years ago, I started one of my "In the Future" rants claiming that in the future you won't have to go online to order things from Amazon. They would just figure out what you need and send it to you. This would irritate students as I went on further to say that this is a good thing as they should not make their own decisions. Since they make horrible decisions, they should let algorithms make decisions for them.

Now, I have to admit, in all fairness, that I love annoying students with predictions about the future presented in a way constructed to draw arguments. Then, as the arguments arise, I dismantle them. I do this for pedagogical reasons, as one cannot think clearly about the present or the future without getting past one's own cognitive biases. And there are always biases about future possibilities. But, pedagogy aside, there are some good reasons for these outrageous claims.

First, you have to ask - why would you interfere with a decision that a machine makes if your intervention is likely to produce a lesser, wrong, or even disastrous result? Consider the computer in your automobile that decides to change the gas/air mixture in your fuel injector or buffer the pressure you apply to your anti-lock brakes. Would you rather have a little window pop up on the dashboard saying "I'm going to apply brake pressure more evenly to keep you from going into a skid. Is this OK? Press Yes or No"? That wouldn't do at all. Clearly, you want the computer to decide for you because your intervention can do nothing but produce an inferior result.

Second, there is abundant evidence that we don't make very good decisions. You might dismiss the previous example by saying that computer control of your car is very different from computer control of your shopping. To this I would ask - do you make good shopping decisions? Do you have any books you bought and never read? Any clothes you haven't worn, movies you never watched, cans of food you keep pushing to the back of the pantry, and so on? Have you ever been to a restaurant that you didn't like, taken a job that didn't work out, or gone on a date that turned out to be a nightmare? The truth is that you don't make very good decisions and allowing algorithms to decide for you might very well improve your quality of life.

Third, in this matter, we fall prey to erroneous thinking that has already been introduced in this blog. And that is evaluating a future technology in terms of the present rather than the future that the new technology brings about. We like to make our own decisions because we feel that decisions made for us will not be made correctly. We also think that we need to make our own decisions so we can learn from our mistakes. But, this reasoning is based in the present where decisions made for us might not be the best decisions. We feel that being able to make our own decisions is in our best interests. But is it? I will give two examples where delegated decisions have proven superior.

When I was a kid, most people did work on their own cars. While you could always find a gas station that would be willing to change the oil, filter, or spark plugs, fill the tires with air, or set the timing, a lot of people were unwilling to relinquish control over their car's engine. As engines and their computers became more sophisticated it became increasingly difficult to do your own work and more difficult yet to do it right. Now, you find very few people who do their own work. After all, the experts do it right, and affordably, so why would anyone get their hands dirty or put oil spots on their driveway?

If the car example does not sell you, then I would ask - do you own any mutual funds? Most people who have retirement accounts have the bulk of their money in mutual funds. They don't pick their own stocks. And, in most cases, if they did try to pick their own stocks they would just mess up their portfolios. Once we realize that fund managers can do a better job and do it in an affordable fashion, why would anyone make their own investments in stocks? Granted, there are a lot of people who still own stocks and I am one of them. But, we do it because we still think we can do better. And this will change over time. Once my stock portfolio takes a bad enough beating while my mutual funds are cruising along nicely, I will probably give in. And with Index Funds and Funds of Funds this day is approaching rapidly.

The point here is that we want to maintain control as long as we think we can do better. Once we realize we can't, we are willing to relinquish control.

Sorry, I have to go. The doorbell rang. I hope Amazon is sending me something.

Friday, January 17, 2014

Will the World Wide Web Catch On?

Will the World Wide Web ever catch on? Today this question looks ridiculous. But, at one point in time the question was very serious. And that moment in time provides us with a revealing story about technology and its acceptance.

The World Wide Web was beginning to capture the attention of IT people in the early to middle 1990's. In order to take advantage of this emerging technology I developed an experimental course entitled "Corporate Web Applications". Most people, at the time, found this to be greatly amusing as the idea of using this glitzy technology for serious business applications was beyond the pale. Even to techies this idea was questionable as most websites were the very definition of poor taste with a garish montage of poorly chosen colors. And, of course, there was the outstanding technical problem of asynchronous database interaction. It did not look promising.

But, the larger barriers were not technical or graphic. They were the usual resistance to new ideas that people have when considering a new technology. As I mentioned earlier, you can always recognize this resistance to the new as people will resort to a standard set of talking points. I told my classes that in the future you would buy things over the Web. It would become the preferred place to shop and the first place you would go for information. But, as obvious as these predictions are in hindsight, they were anything but obvious at the time.

How can you buy clothes that you can't try on? How can you buy a book without leafing through it first? How can you buy any product that you cannot first touch? There seemed to be a prevailing belief that you somehow needed to shake a box before you could buy a product. There were visions of clothes that didn't fit; products that fell apart as soon as you opened the box; and vendors who would not return emails, let alone products. And if that weren't enough, how could you give your credit card number to somebody at a strange website? This conjured up visions of some social misfit with a sleeveless shirt, spiked hair, tattoos, piercings, and endless chains hanging off of anything you could hook a chain to, who had created this site for no other reason than to steal your credit card number. No, that wouldn't do at all.

I recall a group project one semester where the students were illustrating the perils of shopping over the web. They began with a skit. A well built young man walks into the class wearing a t-shirt that is clearly two or three sizes too small.

A woman in the skit, presumably playing his wife, says "Where did you get that shirt? It is way too small". 

He answers, "I bought it over the web and didn't have a chance to try it on".

She, then, responds, "I told you not to buy things over the web. You got exactly what you deserve."

The students in the audience nodded gravely in agreement. Yes, if you buy something over the web, you will get what you deserve.

Now, to put things into perspective, consider the following two statements:

1) You cannot touch the people in Second Life and you do not know who they really are.
2) You cannot try on clothes you buy on the web and you don't know who you are buying them from.

These are both talking points about emerging technologies. They are both true. But, as we find out over time, neither of them matters. They are both examples of how we evaluate a new technology in terms of the way the world is today rather than in terms of the world that the technology helps to create.



Friday, January 10, 2014

OK, What About Sex?



Sex is the big enchilada of the arguments against Virtual Worlds. People can imagine going to work or attending classes in a Virtual Worlds even if they don’t like the idea. But, sex is the show stopper. You can’t have sex in Virtual Worlds; at least not real sex. And if you can’t have real sex how can you procreate?  Resistance to Virtual Worlds is firm on this point. And people who raise this argument see it as the grand mal seizure of technological musings, throwing the possibilities of future Virtual Worlds into a tail spin of chaos. They see it as the one irrefutable argument against living in Virtual Worlds. But, it isn’t. And I will take on this argument now.

First, I would like to provide an analogy to put things into perspective. In the Middle Ages in Western Europe, towns were beginning to emerge and people were beginning to populate them in significant numbers. Imagine an ambitious younger person discussing the future of towns with an elder. The elder might point out that towns have a dicey future because people will eventually want to marry and raise families. But, in a town this would be difficult as you don’t know the families of prospective spouses so you can’t make good choices. Further, you don’t have the benefit of more experienced older people to guide you. And the midwives in the town, if there are any, will not be familiar with your family’s history and so won’t be able to help you if difficulties arise in child birth. These are all legitimate concerns but towns became prominent anyway, many eventually becoming large cities.

There is no end to the list of similar examples where things change in ways that were incomprehensible to people who were used to a different world. Today we elect our leaders rather than being ruled by kings. People choose their own professions. Overwhelmingly, most people don't grow food. And on, and on. The point is that what you are used to is nothing more than what you are used to. It is not reality. It is merely the way things happen to be done at the moment.

So, how might things change to accommodate the problem of sex in Virtual Worlds? Let’s start with the easiest and move to the more difficult to imagine. First, we might just make social accommodations. That is, the partners we choose for procreation come from a different pool and serve different roles than the people we work with or attend classes with. This is not all that farfetched as we seem to be moving in that direction anyway with dating websites.  So, let’s take it a step further. 

Imagine a future where procreation can occur through artificial insemination, so the people involved never have to meet. The idea of mailing reproductive material to the one you love probably sounds repugnant to nearly everyone. But, don't forget that just a couple hundred years ago the idea of storing blood in bags for people who need transfusions would have been shocking as well. Taking the organs out of dead people to put them into living people sounds even worse. So, the idea of separating reproductive materials from their donors is not that different. Another problem with this idea, in the minds of most people, is that it separates sex and reproduction. But, even that does not hold up upon inspection. Even in today's world, the amount of sex allocated to reproduction is a tiny fraction of the amount of sex dedicated to fun.

Let’s take this one step further. We are not that far off from having artificial blood and artificial organs. So, it is not unreasonable to imagine that, at some point in the future, we will have artificial reproductive materials as well. If this happens, then the need for proximity in reproductive sex goes away entirely.It may take some getting used to. But we have gotten used to a lot in the past and will get used to more in the future.

I realize that these ideas are probably repugnant to most readers and I am not suggesting any of them as a desirable future. That is a different debate entirely. I am merely saying that what we are used to is what we are used to, and people in the future may well be used to something different.

To drive this idea home consider a future, perhaps a couple hundred years from now where reproduction is handled neatly by combining reproductive materials in a clinically controlled incubation environment. Then imagine that the technology is lost somehow. And it is your job to convince people to go back to the old way of doing things. How difficult would that be?

Friday, January 3, 2014

What About Hugs?

One of the talking points that people often raise in their resistance to the idea of Virtual Worlds is that the people in Virtual Worlds do not have any physical presence. This in turn leads to a host of derivative complaints. You cannot touch people. You cannot see their facial expressions. You cannot hug your friends or loved ones. You cannot trust someone if you cannot look them in the eye. And, so on.

There are three problems with this line of reasoning. First, it is a distinction between Virtual Worlds and physical presence that is used as a wedge issue simply because it is a distinction rather than a real concern. Second, upon reflection, it is probably not a real concern. And, third, even if it were an issue, the evolution of technology would more than likely eliminate it. It is like saying that I don't like people who are not from the United States because they don't speak English. First, it is probably not really true. Second, if it were true, it probably wouldn't be as big of a problem as I am making it out to be. And, third, the evolution of translation software has made different languages less of a problem than they used to be. So, let's look at these counter arguments individually in more detail.

While physical presence is a distinction between Virtual Worlds and the physical world, one has to ask how important this distinction really is. Do you really want to touch people that badly? I would cautiously suggest that if I were to start touching my students, I would have a much bigger problem than this one of trying to overcome resistance to Virtual Worlds. Touching is way overrated. Next time you go to the grocery store, I would encourage you to hug the cashier and see how well that works out. Or, next time you get pulled over for a traffic ticket, get out of your car and put your arm around the person writing the ticket. I am not saying that there is no circumstance in which touching is a good thing. I am just saying that its importance as a wedge issue between virtual and physical worlds in our daily affairs is vastly overrated.

Second, physical presence is not as big of a deal as most people claim. Back in the days of limited mobility, people wanted to know the families of the people that they married. This was considered a critical point. But, in today's more mobile society, more and more people marry significant others with little concern for their family background. It used to matter what town you came from and what school you went to. This is all rapidly eroding with the increasing mobility of society. Similarly, we used to think it was important to know about the people around us. But, in our modern age of personal privacy, we have had to learn to get along without that information. Likewise, we believe that we need to see people's faces, look them in the eye, and watch their body language in order to understand what they are saying and whether or not we can trust them. This is probably a similar convention which will evaporate over time.

Third, technology will advance over time, so some of the information we seek from presence may eventually become available. Here are two examples. First, with the staggering amount of information available on the Web, it might be possible to find out way more about the people with whom we are interacting than we could ever find out by just talking to them face to face. And just like the way people on dating sites who won't provide a picture are ignored, we might just choose to ignore people for whom there isn't a sufficient amount of information online. Pursuing the dating site analogy further, most people are skeptical of others who post pictures that are clearly decades old. If people's avatars do not look like them, we might just shy away from them. Finally, as digital video technology advances we might be able to project real time facial expressions onto avatars. So, not only will you be able to look them in the eye and see facial expressions in real time, but you can record them, watch the expressions over and over again at your leisure, and even run them through analytical software to see if they were telling the truth.

Suddenly, the physical world, where interactions are dicey at best, doesn't look so good any more.