Wednesday, March 28, 2012

The Wet Blanket Test

I just read one of the strangest suggestions ever offered on the topic of innovation. In a book intended to show companies how they can break out of business as usual and discover innovative solutions, the author suggested building an innovation team by assembling a group of people from various disciplines and departments who ... and here comes the horrible bit ... have ten or more years of experience at the company.

On the surface, this might seem an entirely reasonable suggestion - seasoned people have a deep understanding of the firm's practices and culture and should be able to quickly focus on the ideas with the greatest potential. However, there's a reason that firms remain rutted in their present practices - and that reason is typically these very same people. Employees with ten-plus years of experience are knowledgeable, but they are very often institutionalized. It's a stereotype, granted, and there are likely some who can keep a fresh perspective after being in the same job for such a long period of time, but such individuals are likely unusual, especially in established corporate cultures.

If you've observed or participated in focus groups, you're likely familiar with the mother-hen type who herds the rest of the group away from anything unfamiliar: they'll interrupt someone in mid-sentence with a phrase such as "that won't work here" or "that's not the way we do things at this company" to squelch anything that breaks from traditional practices. Eventually the conversation collapses, other participants stop making suggestions, and the exercise is so thoroughly undermined that no truly new ideas are produced.

This seems to happen fairly often: get a ten-year veteran of a firm in a room with people with less time on the job - even if the others have more experience in their profession or industry - and they smother innovation rather than create it. Whether it's the desire for others to kowtow to their years of service to the firm or the malicious joy of the barren spinster who shoves a pregnant young woman down a staircase to make her miscarry, the tendency to obstruct others is an ugly and despicable part of human nature, but it's a part of human nature nonetheless that expresses itself, almost without fail, in any group of people.

But to escape the dark quagmire of the psychology of the institutionalized employee as an individual and consider the problem they create for their firm ... being innovative necessitates defining a possible future, unencumbered by the limitations of the past. If you begin by considering what has been done over the past decade, rather than what might be done if you break from the current way of things, then you're looking in the opposite direction to where your future lies.

That's not to say that institutionalized persons have no contribution to make to a process of innovation - just not as part of the team that comes up with fresh ideas. Instead, their value is in following their nature: smothering and attacking any new ideas that don't fit what they perceive to be the corporate culture, and to which tenured employees will refuse to adapt. Tenured employees are the perfect group to assess which ideas will be difficult to implement, because they and their kin are the very ones who will create obstacles and sabotage plans in an attempt to avoid moving from the comfort and familiarity of tradition.

However, their smothering instinct must be mitigated: when an institutionalized employee suggests that an idea won't work, ask the question "why?" and do not accept "it just won't" for an answer. Sometimes, it's just naysaying and cynicism, but other times there is a valid reason that a given idea will not work given the current resources and culture of the institution. Then, dismiss the given - would the idea work if we addressed the factors that prevent it from being viable? Is the organization able and willing to make those changes to gain the potential benefit?

Sifting through ideas and examining all the things that could prevent them from succeeding is a valuable task, but it needs to be a later step in the process of innovation - and to the original point, it's certainly not something to be done at the very beginning, or in a brainstorming session - including the institutionalized insiders will smother innovation before it can even begin.

Saturday, March 24, 2012

Pandering to Obsessive-Compulsive Disorder

One criticism of technology is that it panders to dysfunctional behavior, exacerbating or even creating problems. In particular, it's been suggested that technology feeds obsessive-compulsive disorder and may go so far as to create it in otherwise mentally sound individuals, or at the very least leads them to mimic the symptoms and suffer the consequences.

Perhaps that seems an exaggeration - but consider the mentality of a person who carries a small notebook. Every time he goes anywhere, he makes a note of his route and places a tick-mark on the page with each step that he takes. Every meal he eats is recorded in detail: how much of each thing, where he was at the time, and his opinion of the taste and texture of each element. For every conversation he ever has, he writes down the names of the people present, the time and place, and everything that was said, verbatim. Every waking moment of his life is cataloged, recorded, and quantified. He goes home at the end of the day and transcribes everything into ledgers, makes charts and graphs, and writes a list of prescriptions and admonitions he will follow the next day.

Such a person would be considered unusual, borderline insane, and the very act of recording and ruminating over the minute details of his daily life is behavior that should likely be discouraged. He's not really harming anyone but himself, though likely annoying quite a few, and causing everyone who observes his activities, or to whom he describes his mad plans, to wonder if he might be dangerous.

And yet, there's a mobile application for each of the behaviors described above, and more. Granted, the act of recording the details in a notebook is a serious intrusion into and impediment to real life, and he would seem outwardly more well-adjusted if the task were facilitated by an electronic device. But it's not the act of recording that is most questionable - it's the compulsive need to keep track, and his obsession with the data he collects.

The clinical definition of OCD indicates that behavior is to be considered problematic when people devote an inordinate amount of attention to unnecessary details and, by doing so, cause themselves grief and distress. We tolerate and even admire to some degree a person who is able to accomplish things by being meticulous and attentive to detail - but when they don't seem to be accomplishing anything, yet persist, the behavior is questionable.

Take for example "diet" products and programs that require participants to record their food intake in detail. How many people do you know who are on such a program, who have been on it for a period of years, and who have not accomplished much by it - except to give themselves the task of recording details at every meal, to suffer woe and regret over the things they do not allow themselves to eat, and to turn any conversation that touches on food into a lecture about calories and "points," so frequently that people avoid mentioning the topic in their presence? I don't think anyone can convince me that this is sane and healthy behavior.

It's worse that this is further encouraged by the healthcare profession, whose consideration of statistics leads them to ignore the patient before them, and instead prescribe medication to treat numeric data when a person is clearly suffering neither disability nor discomfort. And the medication isn't taken to cure a problem, but becomes something the "patient" has to purchase and consume constantly, for the rest of his life.

The alleged point to this obsession over numbers is a desire to understand and ultimately improve the human condition. The numbers impart an arcane knowledge of things that cannot be seen with the naked eye, or might be seen but would normally be dismissed as unimportant unless they could be recorded and quantified (as if the act of recording and quantification imbues some significance onto the insignificant) - but even advocates of this are hard pressed to demonstrate that this obsession with data ever makes possible any real improvement in the vast majority of individuals who are encouraged to obsess.

Aside from my ranting about bizarre and disturbing cultural trends that make people tedious and boring, there lies the question of ethics. I have the strong sense that the future will regard our present obsession with numbers as the equivalent of the dark ages' obsession with spirits and demons. Wouldn't "triglyceride" make a fine demon-name?

And isn't this obsession with numbers accompanied by just as much quackery, superstition, and downright stupidity as belief in mystic forces? And is it at all ethical to sell a gullible public various powders, potions, talismans, and other devices that don't even cure the conditions they probably don't even have, but are convinced are important because they are represented by mystic sigils we call "numbers"?

I have the sense this has turned into a rant for the opposite extreme, which was refreshingly purgative, but perhaps muddles the point: measuring what matters and gathering data is a sound practice that leads to a more informed decision and a better outcome. So in some instances, arguably many, it's entirely worthwhile. But when it becomes a pointless exercise - when the data doesn't inform any decision, positive outcomes are not achieved, and the act of measurement becomes intrusive - it's clearly dysfunctional.

Tuesday, March 20, 2012

Nine (Bad) Predictions for Social Media

I recently read Scott Klososky's Manager's Guide to Social Media. I've read a few other books on the topic of social media and will likely be reading a few more on the same topic this year, as it's an emerging area in which it's difficult to sift through the hype to find the genuine value.

To that point, the author made a number of predictions for social media - things he expects to materialize in the next few years - some of which seem reasonable, others over-enthusiastic. I figured it was worth keeping the list, and I'll check back in a few years to see how it matches up.

Prediction 1: Tools will become standardized

SK points out that there are multiple tools for various tasks - blogging platforms, IM clients, media sharing sites, etc. - and that convergence is inevitable. Users dislike having several different tools to perform different tasks, and having to join several sites to access different people's content is a nuisance the market won't long tolerate.

I disagree. While the rationale is good, the same has not happened for older technologies. There are still three or four "major" Web browsers that people use, and the situation is similar with search engines, e-mail clients, and other Internet tools.

I expect social tools to follow the same pattern: there will be a small number of leading brands, and variety and the pace of change will slow, but I don't expect there to be a single standard.

Prediction 2: There will be a central database of people

Another problem SK sees is that users have to create a separate profile for each social tool that they use. So, in the future, a person will have a single source that stores all their personal information, and multiple other sites will leverage this same profile.

I disagree. Aside from the problems of standardization, my sense is that users do not want this at all. While it would be more convenient, it's also contrary to the desire to present information that is relevant to a given forum: the information a person presents about himself on Facebook is different from the information he provides on LinkedIn - details such as age, gender, ethnicity, religion, or political beliefs, which, while appropriate to a social profile, would be entirely inappropriate for business contacts (especially potential employers) to be informed of.

With that in mind, I predict that there will continue to be separate profiles for each site. Though I do notice that Facebook is becoming a central profile that niche sites will leverage, my sense is that people will wish to disclose different information depending on the "crowd" they wish to interact with using a given tool.

Prediction 3: A centralized directory will emerge

The author also mentions the difficulty in finding people online. The entire point of social media is to connect with others, and it's difficult to do so if a person who wants to connect must sift through the various social media sites to find out which one their desired connection utilizes.

I'm inclined to disagree, but I am not entirely clear on what the author is envisioning. It seems to me he envisions a centralized white pages, where you can enter a name and perhaps a few more details to get a list of all their social media presences.

If this is his suggestion, then I certainly disagree. There are already a number of small players who are attempting to become the white pages of the Internet by aggregating profiles from various sites. The problem is that they are inaccurate and, in order to be accurate, would require the individual to establish an account to verify the details. This has not met with much success, though that's likely because the services I have seen charge fees for anything beyond a basic search.

On the other hand, sites such as Google do a fairly good job of this already: search for a person's name and you will find links to the various social media sites they use. If a person has a common name, you can enter a detail such as the firm they work for or the town they live in to narrow the search fairly well. As such, I don't see a need for a centralized directory, and doubt there's much market for one, though it's likely that a highly popular site such as Facebook will provide this sort of functionality (arguably, it does so already).

Prediction 4: There will be a resurgence of virtual worlds

The author maintains that as bandwidth and 3D rendering increases, we will see the re-emergence of immersive virtual worlds, similar to Second Life, in which people can interact with avatars of one another in simulated physical environments.

Not only do I disagree, but I find the notion laughable. The excuse of bandwidth and rendering engines may have been valid in the days of modem connections, but the problem with Second Life wasn't the technology, which was really quite good, but that a "virtual world" is awkward to navigate and interact with. It was, and will likely continue to be, a bad user experience no matter how smooth the animation or detailed the textures.

I can see niche applications for virtual worlds or 3D modeling, but I do not expect they will overtake text-based communication in social media.


Prediction 5: Businesses will leverage crowdsourcing

SK predicts that as more and more people become connected, there will be greater opportunities for businesses to outsource tasks to large groups at minimal cost, and that more work that is presently handled by full-time employees will be handled by people who wish to sell a few hours of their time here and there.

I disagree. The infrastructure for crowdsourcing has been around for years, and the vast majority of people who have the credentials to serve as knowledge-workers are already online. But I don't see that the business world has learned to adapt to this method of working, and I have even greater doubts about employers' willingness to offer adequate compensation to gain the services of large numbers of qualified professionals (the entire draw of crowdsourcing is to get a lot of work done for virtually no cost).

As with virtual worlds, I see this as a niche interest that can be applied to a small number of efforts, and I don't predict that this will change in the next couple of years.

Prediction 6: A people rating system will emerge

The author rightly points out that trust is a major issue online: we do not know if the people with whom we are interacting online are who they claim to be - from the middle-aged pervert posing as a teenaged girl on dating sites to the fictitious personas used to apply for credit cards. The author foresees a person rating, similar to a credit rating, that certifies a person's identity and credentials.

I agree that there is a need for this, but disagree that there will be a centralized service or rating agency. For the most part, a person's "real" identity is unimportant when interacting with strangers online in a social manner. Where it is important, the tell-tale signs of an impostor are fairly easy to spot (a new account with scant information and few friends or connections). Where it is critical, there are other methods for validating a person's identity that businesses presently use before they will extend credit to a person online.

In all, it would require a major effort and widespread participation to address a problem that doesn't surface very often, and for which there are currently reliable solutions.

Prediction 7: There will be more open access to social content

SK points out that there is a lot of information bottled up on social media sites, and significant desire to be able to sort through it all. While search engines allow us to retrieve some information, much of what is posted remains locked away, and the author predicts this will change in the next few years.

I disagree. While the notion of transparency is much touted by those who want to be able to gain access to information about others, it is also very much resisted by those who wish to be more selective in determining who gets to see the information they post.

The trend among social sites seems to be to allow greater blocking/filtering rather than less, and I don't expect this will reverse itself in the next couple of years. All it takes is one sensational media story about how someone was victimized by a person or exploited by a corporation to batten down the hatches tight.

Prediction 8: Game skills will be valuable to business

The author points out that there are a lot of people among the younger generation who have considerable experience at playing video games, and that the business world will soon discover a way to put these game skills to work in a professional capacity.

This is another prediction I find laughable: it's likely been a dream of many kids that their high score at Mario Brothers entitles them to a six-figure income in real life, but it's never been so, and it's highly doubtful the corporate world will risk its reputation on the "skills" of gamers.

Primarily, gaming involves solving a puzzle that is designed to be solved - that is, a game is by its very nature rigged so that success is relatively easy to achieve. While it takes considerable expertise to design a puzzle, solving a puzzle requires only observation, trial, and error within the safe environment where reactions are predictable, completely unlike the real world where parameters are unpredictable and unknown.

The consequences of video gaming are likely more sinister for the business world: employees will expect major rewards for little effort, prefer repetitive tasks, shun risk and uncertainty, and have little genuine creativity. As customers, they will be adept at examining business processes and procedures to find shortcuts and loopholes that benefit themselves at the cost of others.

The notion of "gaming" a business already has a negative implication. The only positive spin is that being cheated by one gamer teaches a firm how to avoid being taken advantage of by others who would make the same moves - which means that customers, employees, and others with whom the firm interacts can (continue to) expect to be treated as if they are dishonest by default.

Prediction 9: More people will be involved in paid promotion

The major reason marketers are attracted to social media is that third-party testimonials are more credible to prospects than claims from the advertising department. With this in mind, the author predicts that businesses will seek ways to pay people directly for mentioning their products and generating positive buzz, and there will soon be "an explosion" of pay-per-mention programs.

I disagree. People are innately aware of something that many firms seem to have forgotten: that integrity matters. We value word-of-mouth more than paid advertising because it is genuine: a person tells others they got a great deal or a great product because they genuinely feel that way, not because they have been bribed to do so.

There have already been a few incidents in which an influential blogger or popular online personality has lost credibility, even been anathematized, when it was discovered that they received compensation for promoting a product. Seeing people who had spent years building credibility have their reputation ruined for a free t-shirt or a ten-dollar discount makes it less likely that others will make the same mistake.

As such, I expect a few firms may try this, but doubt that many reputable people will be interested - or if they are, the consequences to social media may be even worse: if it becomes an environment where you can't trust what other people say because you suspect they are being paid to say it, word-of-mouth online will lose its credibility as well.


Friday, March 16, 2012

Natural Interfaces: Unready

Conceptually, I'm intrigued by the notion of natural interfaces - being able to escape the tyranny of mouse and keyboard, and use "natural" motions and the spoken word to be able to communicate with a machine has a certain wondrous glimmer about it. That is, until you actually consider what it would look like.

To be specific, I saw a kid in a store playing with a demo video game console that eliminated the controller by using a camera to recognize the player's body. After a brief configuration, the game could track the movement of his body, so that by moving about in space he could control his avatar in the game. He seemed to be having a blast - but from a distance, he looked like he was having a seizure, until you got close enough to see what he was actually doing.

I can't imagine that the same technology could be put to more practical application, at least not without hilarious results. Imagine an otherwise distinguished-looking person standing in front of an ATM on a public street, gesticulating like a spastic wizard in an attempt to use "natural" gestures to communicate "I want to withdraw fifty dollars from my checking account."

Of course, you could use language instead of gesture, but depending on where the ATM is located, it might not be wise to announce to anyone within earshot that you will very soon have cash in your pocket. But that's more tragic and less funny than watching someone do an awkward impromptu dance routine in a public place to perform a simple task.

I also question how "natural" the natural interfaces actually are. I am at this moment sitting in front of an Apple computer, with a "magic" trackpad that is collecting dust because it requires me to learn a number of un-natural and un-intuitive motions to perform simple tasks. Maybe it's just that I'm accustomed to using the scroll-wheel on my mouse that makes it seem awkward to have to learn a new method of navigating a Web page.

And in fairness, each gesture seems very simple - scroll by stroking the pad with two fingers (not one, and not three) and use a pinching motion to make something smaller - but the fact that I can already do these tasks easily makes the language of trackpad gestures into something I would have to devote time to learning and become accustomed to. I don't accept the notion that these are natural or intuitive motions ... I don't "naturally" sweep two fingers across a piece of paper to move it around on a physical desktop.

Using gestures means having to learn an entirely different and unfamiliar physical language. I think I would type much more slowly if I had to spell out words in sign language than by merely typing them on a keyboard (admittedly, I might be more accurate if I were slowed down a bit). Though I might initially be amused by the novelty of doing something "different," it would not ultimately be my preference, due to the awkwardness and inconvenience.

To tie up this meditation, I am still intrigued and amused by the notion of natural interfaces, and hopeful that something better will come along - but my sense is that some technologies are being rushed to market that are not yet baked out quite enough and remain inconvenient to use and embarrassing to be seen using.

Monday, March 12, 2012

Customization as Abdication, Part 2

I sense I may have been too oblique and abstract in my last post, in which I suggested that burdening the customer with designing their own product is an abdication of the responsibilities of the producer. I've got an example that may help concretize it a bit more:

There's an Asian restaurant in my area that gives customers two options: you can order an item from the menu (standardized product) or you can go to the stir-fry bar and pick out your own ingredients and sauces and hand them to a guy who will toss them into a wok for you (customized). I've never gone for the customized product here, and have seldom seen anyone else doing so.

The entire reason I go to a proper restaurant (not merely a QSR when I need to cram something in my guts to make them stop complaining) and pay a premium is because I value the skill of the chef - he has the expertise to know what ingredients to combine and in what proportions to make a "good" dish. The service that I am paying for is the expertise of a skilled professional who knows what he's doing and will make better choices for me than I am capable of making for myself.

In all fairness, some customers clearly feel otherwise. If no-one at all wanted the customized offering, the bar would have been taken down at some point and replaced with more tables - it's a huge waste of cash for the restaurant to chop up so much product and throw it away at the end of each day (which I hope is their practice) - so to my earlier point, different customers prefer different things, and it's probably wise for the restaurant to offer both extremes.

Another example is a chain I saw in Dallas, steakhouses where the patrons cook their own steaks. I can't say much about that place, as I never set foot in one and find the very notion laughable - but they're still in business, which means that a sufficient number of customers actually delight in this option.

But to the point ... if a restaurant merely provides ingredients and equipment, and customers must come up with their own recipe and cook their own food, then what value does the restaurant provide to its customers? They peel and chop the vegetables, start the fire, and that's it. Hardly worth the significant difference in price between a restaurant meal and cooking at home, and I sense that's part of the reason that there is a great revival of interest in home cooking.

And to go further ... if all restaurants were to move to this business model, how does one restaurant differentiate its service from that of the one next door? If the only difference is in the choice of ingredients, I don't see that as sustainable: if the only advantage you have over a competitor is that you have parsnips, and customers seem to value that difference, the competitor can match your offering the next time it places a grocery order.

And as usual, I feel the need to state I'm not arguing for either extreme, though I do sense the pendulum has swung too far in one direction for certain offerings, to the point that providers fail to provide sufficient value. There is no single answer as to how much of the design, and even the labor, in producing a product ought to be laden upon the customer - while I do like having some choices, I also expect a vendor to apply some expertise in identifying a product that suits my needs. It's their job, and it's what I'm paying them to do.

Thursday, March 8, 2012

Customization as Abdication

One of the drawbacks of customization is that it places the burden upon the customer to decide which options best suit their needs. There’s a careful balance to be struck between the extremes of burdening the customer with design decisions and restricting the customer to a standard offering.

The era of mass production was very much about standardization: a product could be produced very efficiently if every unit is exactly the same and the customer can take it or leave it. Henry Ford’s statement that “a customer can have a car painted any color that he wants so long as it is black” reflects this attitude – and he held fast to it until he began losing ground to competitors who offered other choices.

In recent years, technology has pressed us perhaps too far in the opposite direction. While it is in some instances delightful to be able to pick and choose among hundreds of qualities, features, and options, it has become quite a chore to have to tell a product manufacturer all of those details.

I don’t see a way to arrive at a perfect blend for every customer. No matter how you balance the freedom of choice against the burden of having to choose, someone will tell you you’re making it too hard and someone else will tell you that they feel constrained. You will never get that right for all people.

But more to the point, my sense is that technology is pressing us too far to the opposite extreme, where the customer is burdened with many choices – not just from different providers, but from providers who refuse to design a product and instead lay that task on the customer.

You can wrap that in whatever rhetoric you please, but it comes down to a lack of competence, or perhaps a lack of moral fiber. If the producer makes the decisions, the producer is to blame if customers don’t accept the product. If he can duck the responsibility of making a decision and instead depend on the customer to do so, then the customer must accept the blame if the final product doesn’t suit his needs.

And that is the abdication of service. If a producer cannot provide a solution to a customer’s needs, but merely follows the customer’s preferences without lending any assistance to the decision-making process, the question becomes: what is that producer really doing for the consumer?

Ultimately, the producer of a product contributes value to his customer by providing a solution. The physical good produced is merely a means to that end, and likely less important than the intangible element that makes the physical product of value.

To be clear, I’m not arguing in favor of the opposite extreme of standardized products and no customer choice – a balance must be struck that does not go to the exact opposite extreme of requiring the customer to design a product that serves their own needs. In many instances, the customer doesn’t know – and what he is paying for is your expertise in determining a solution, not merely making a thing according to someone else’s directions. To refuse to provide any assistance is to abdicate the role, and responsibility, of a service provider.

Sunday, March 4, 2012

Customer Rebound

Most firms, or at least many, are very good about user experience for any interaction that is in their immediate interest to accommodate, and completely dreadful about any interaction that isn’t. It’s very easy to sign up for cell phone service, and very difficult to cancel your contract. It’s very easy to purchase an item at a department store, but very hard to exchange it if you later find it’s the wrong size. There are many examples.

A remark I hear fairly often is that companies purposefully make these transactions very difficult so that people will just give up. It’s a justifiable perception, and the motive for firms to do so is unmistakable … but I’ve never been called upon to purposefully make a transaction harder to complete.

I must admit, the notion has its appeal: splicing together some of the awkward and difficult elements I’ve seen into a single, horrific task flow that will completely defeat all attempts to complete it, just as an exercise in absurdity that would cause a great deal of grief to people I will never see. But then, I’m a little bit evil that way.

But the truth is that I have never been involved in, nor heard from anyone who has been involved in, a design process in which the business requirements were to make the transaction as difficult as possible for the user so that they will give up and the firm can keep their money. It just doesn’t happen that way, at least not on purpose.

What also doesn’t happen is that a firm shows much interest in investing a single dollar or allowing any employee to devote a minute of their time to considering how to make those transactions easier. They’re as difficult as they are not because of attentive design, but for lack of attentive design. Simply stated, if it weren’t for user experience professionals, every transaction would probably be that clumsy.

But this is where the core of the accusation is supported: firms choose not to spend any resources making such transactions easier on the customer because there is no ROI. That is, they don’t purposefully make them harder to keep the customer’s money, but they don’t purposefully make them any easier (though perhaps intentional neglect is the moral equivalent of purposeful harm if you know it to be so) because it doesn’t make any money for them – which seems like a roundabout way of saying exactly the same thing.

The problem for user experience professionals is convincing firms to spend budget on transactions that don’t make them any money – no ROI, no resources. The closest I’ve come to succeeding is to suggest that the firm take a look at its rebound rate: the number of customers who closed their accounts but reopened them later. If the last impression made on a customer is a negative one, the chance of their bouncing back is significantly diminished – or conversely, if the “exit” task is made less painful, the number of returning customers will increase.

Naturally, that idea didn't get very far at all: it was promptly rejected, and employees were forbidden even to discuss the idea any further.

I still think that would bear out in reality … but I would likely need some hard numbers to revive it from the trash heap, at least at this particular firm. I’ve not seen any research into the phenomenon, for or against, so my sense is the idea has not been given much consideration anywhere outside of industries such as retail, where it seems to have taken root.