Tuesday, December 30, 2014

Language and Thinking

There is a circular argument about the manner in which language limits thinking.   The argument that our ability to think is limited by vocabulary is used to suggest that we cannot understand anything we do not have words to describe, but the counterargument is that we have the demonstrated ability to coin terms to describe things for which our vocabulary is insufficient.

It must be acknowledged that we are unable to gain, through second-hand experience, any knowledge that cannot be articulated. That is, if we gain information about something from another person, in speech or in writing, then the other person is only able to communicate knowledge that he can convey by means of words.  Our ability to receive information depends on his ability to send it, and his ability to send it depends on his skill in communicating.

So it must likely be conceded that we cannot gain second-hand knowledge of anything if our own language skills and those of the source are inadequate for communicating a specific bit of information.   But this does not mean that understanding is limited in a general manner: knowledge precedes language, and is not derived from it.

First-hand experience defies this limitation because the information we receive is not limited to the verbal channel.   For example, there are sensory experiences that we remember and develop an understanding of, but cannot consider verbally - there are no "words in our mind" while we experience a sensation - and which cannot be adequately described to us by anyone else.   While ineffable, it is a sense-memory impression that becomes for us a part of our mental model of a thing, and in some instances quite an important one.

As an aside, this also addresses notions about intuition or extrasensory perception.  There is to date no method of describing how these processes work, but it is plain that there are sensations we have that are ineffable.  So to "sense" something is likely to possess knowledge that cannot be articulated.  It is well within the range of sensory perception, but beyond our ability to verbalize.

It is not uncommon to notice that someone is adept at something but cannot explain it.   Ask an old person who is able to routinely select the best produce in the market how she does it, and she is likely to say that she "just knows" from having done it so many times.  That is to say, she possesses knowledge but not the ability to convey it in words.

We have the same experience when, after concentrating intensely upon a problem and failing to find a solution, it seems to "pop" into our minds when we stop trying to think about it.   Our mind is able to make connections that we cannot describe in words, and by leveraging this unspoken and unspeakable knowledge, we are able to derive a solution.   It is not that we were not thinking about it, simply that we ceased limiting our mind to our language.

Another example is the difficulty we have in describing an experience to others with sufficient fidelity; we often find we cannot do so.   Consider how you might describe the flavor of a pear to someone who has never tasted one - it is impossible to do with such accuracy that they would confidently know, upon tasting one, that it is the same thing you had described.

Language, therefore, is a tool of communicating thought, but not the stuff of which thoughts are made.  A great deal of confusion and misunderstanding comes of failure to make that distinction.

Friday, December 26, 2014

Turning Criticism into Collaboration

“Everyone’s a critic” is a problem designers know all too well.   When you show a design solution to a group of businessmen, computer programmers, accountants, and managers, they love to pick it apart and can be highly creative in imagining scenarios in which your design will fail, spinning up hours or days of work in which you have to do some deeply ridiculous things to accommodate their criticism.

While we have little love for critics, it’s unavoidable that people with little understanding of design are going to criticize your designs – and as much as you might relish the thought of bringing a case of pacifiers and a tube of super-glue to design reviews, that’s a choice that the HR department will likely rule out.

It is also not very productive to turn the situation into a battle of wills, which is a contest to see which person can be the most infantile.   Designers in general suffer from the stereotype of throwing tantrums when someone criticizes their work, and while that can be purgative and entertaining, it’s not a solution to the problem.

Neither is it particularly productive merely to silence our critics.   Everyone who’s invited to a design review has something to offer, and knowledge that must be leveraged in creating an effective solution – which is the reason they were invited (or saw fit to invite themselves) in the first place.   So instead, we must find a way to channel their negative energy into something more positive.


Criticism is not Collaboration

The first thing to do is to inform the culture of what it means to genuinely collaborate – because some critics think that their nit-picking is in some way helpful to the goals of the group, the project, and the organization.   But there is a significant difference between criticizing and collaborating.   Consider these three key points:


CRITICISM                         COLLABORATION
Gives Problems to Others          Gives Solutions to Others
Makes Conversations Longer        Makes Conversations Shorter
Takes Us Further from the Goal    Gets Us Closer to the Goal

Both criticism and collaboration are part of the problem-solving process – you must find a problem before you can solve it, but the work does not end there.  If you’re going to make a contribution to an effort, rather than just creating more effort, you have to provide solutions to the problems that you identify.


Collaboration Requires Labor

It’s no accident that the word “labor” appears right in the middle of “collaboration” because it takes effort to be collaborative, and the “co” prefix indicates that it’s work that two or more people do together.  It’s certainly not a process by which one person gives work to others and then walks away from the problem they just created.   I believe the word for that is “executive management.”

The problem is that people are basically lazy and want to do half the job – or far less than half, because it takes two minutes to point out a problem that will take someone else two hours (or days, or months) to solve.   But that’s not collaboration – it’s criticism – and rather than silencing your critics, you must get them to be collaborative.


Turning Critics into Collaborators

How do you turn a person who wants to criticize others’ work into a collaborator who does something positive to achieve the goal?   Two words: be stubborn.  Do not be stubborn in ignoring criticism, but be stubborn in demanding collaboration, and do not let them off the hook until you have their commitment to give it.

One technique is to respond to a criticism by asking “so what do you suggest?”   Very often a person who points to a problem already has a solution in mind that they are reluctant to suggest.     That itself can be trouble because the solution they have in mind is often very, very bad from a design perspective.

But instead of dismissing their ideas, help them to improve them by asking two more questions: “How does that solve the problem you just raised?” and “How do you know it will be effective?”  That last question is an easy one for someone who understands design, because they can cite theory or evidence of design choices that are effective.

You can use these exact phrases to encourage critics to collaborate – What do you suggest?   How does that solve the problem?  How do you know it will be effective? – or develop your own, improvising to suit the situation so that you don’t sound like a broken record every time criticism rears its ugly head.   The point is to get critics past merely criticizing and have them contribute something that will move the team forward rather than holding everyone back.


Terminating the Terminal Critics

Again, it’s generally the case that a critic means to be helpful, but there are “those people” who can’t help themselves and who cannot be helped to overcome their negative behaviors.  The “terminal critic” is a lot less common than you might think – and if you try the technique of stubbornly demanding their assistance, they will often recognize their own bad behavior, correct it, and adjust their approach to you in future.  People generally mean well, and you should always assume positive intentions, despite overwhelming evidence to the contrary.

However, there are those special few who insist on criticizing and will refuse to collaborate – and there are toxic workplaces in which a complainer who dumps problems on others is seen as having “management potential.”   You can work to change the culture, or you can get out of it.   But when you are in an organization where the culture is positive and there are a few negative individuals who don’t fit in, and stubbornly refuse your efforts to get them to collaborate, then sometimes the answer is to terminate them.

The technique described is not intended to dumbfound or embarrass critics into withdrawing their criticism, but to get them to contribute the thought and labor required to be collaborative – if their criticism is valid.   But if they are dumbfounded, it’s a strong sign that the criticism is not at all valid, and they will often withdraw it, and then refrain from making unfounded critical remarks in future.

That’s not at all a bad thing, because you’re not discouraging them from collaboration, just from criticism.  And it sends the strong message: collaborate, or be silent – and in the long run, be collaborative, or be gone.   That sounds a bit Machiavellian, and it probably is – but if you repeatedly try to bring out the best in people and you find “the best” just isn’t there, then you don’t need them and neither does your project team.   They will get the message that they are not being productive or helpful, and so will everyone else in the room … and in the end, square pegs have the tendency to pop out of round holes all on their own.   Enough said?


Building a Collaborative Culture

To end on a positive note, you should work to create a collaborative culture in your team and your workplace, and unless you’re in a management position you are going to have to do it slowly, one critic at a time and one criticism at a time.  You will see results, but just as slowly, as your team becomes more productive, working relationships become better, and an atmosphere of criticism turns into one of true collaboration.


This is a critical step in advancing in your own career: moving from your role as a design practitioner to the role of a design coach and even a design leader requires improving not only your individual performance, but also that of your team and company.   And when the powers that be recognize that you’re the kind of person who can do this, your chances of being elevated to a management position in which you can spread positive influence more broadly and more quickly will significantly improve, at least in an organization that values and rewards collaboration.

Monday, December 22, 2014

Constant Innovation

The demand for change is one of the most disastrous conceptions of the present time and the cause of incessant upheaval.   Many who are fond of calling themselves "innovators" demand change, simply for the sake of making a change, on the assumption that anything different would be more likely to produce a positive outcome than the present system.

The psychological basis of an unspecific demand of any kind is emotion devoid of intelligence.   The sense that the present situation is unsatisfactory is largely a construction, though it may be rationalized by selecting certain observable facts that support a case for discontent.   But the discontented are not often clear on what, exactly, is causing their discontent - nor are they clear on the manner in which making a specific change would produce a better outcome or, for that matter, what specific outcome they wish to produce.

In extreme cases, it is more readily noticeable.  It is very common for dysfunctional individuals who experience negative emotions to seek a cause for those emotions, and when they are unable to find a rational chain of causation then an irrational one will suffice. The paranoid schizophrenic sees himself beset by imaginary foes, or suggests that any passing stranger is conspiring against him, or that unseen spirits or inanimate objects are the cause of his discontent.

In the same sense, those who demand change without reason seek something to blame, and set themselves upon systems to which they assign the failure to achieve the results they desire, campaigning for a random selection of adjustments to those systems to remedy their imaginary insufficiencies - all without understanding the real causes of the present situation or assessing whether the changes they demand will be effective in achieving a positive outcome.

The history of many companies is marred by such behavior: there are incessant changes, each of which "is a little revolution" that seeks to effect an improvement - but because neither the cause nor the solution has been analyzed to any appreciable degree, their schemes very often fail and more changes are necessary to mitigate the consequences of the changes made the day before.

And if the outcome of ill-conceived improvements is not detrimental enough, the mere fact that constant change is being made does significant damage.   A rapidity of change means that even a promising plan is abandoned before it has a chance to be fully implemented and that there is constant effort and little progress to show for it.   The result of this is not improvement, but chaos.

Particularly in large organizations in which politics often subvert rational criteria for evaluating initiatives, there is constant strife among conflicting parties that prevents any effective decision from being made.  Or when a party builds sufficient momentum to make a decisive maneuver, it is nullified by the next party to take control before it is effected, or often even implemented.   The net result of doing, undoing, redoing, and then trying something else is at best stagnation and decay, and at worst a series of grave mistakes that provides a speedy route to unprofitability.

This penchant for constant change has spread from a few select sectors to virtually all industries - and while consumer spending is often blamed for the current condition of the world economy, the decline in performance is of a much greater magnitude than lack of revenue would suffice to explain.

While it is claimed that something as significant in scale as the world economy makes slow progress by taking small steps in the right direction, it is more accurate to observe that it makes no progress by taking small steps in random and contradictory directions.   In such a situation, the stagnation and collapse of economies is inevitable.

Tuesday, December 16, 2014

Design Thinking Debuzzed

Too often, I've heard the phrase "design thinking" used to cloak a half-baked idea and dismissed it as a buzzword that is spoken by far more people than understand what the concept really means.  Martin's The Design of Business is not likely to solve the problem - people who use words they don't understand don't tend to be fond of reading - but it does at least help differentiate what is design thinking, and what is not, as a means of making discussions more productive.

A deeper problem is that the practice of design is much misunderstood as an aesthetic or emotional reaction to the superficial sensory qualities of things.   To a non-designer, the word "design" is poorly understood: it is what "looks good" and nothing more - when in truth it pertains to a perspective that leads to more than superficial changes, and defines a highly effective approach to solving problems at a much more fundamental level.

Design thinking is a methodology that leads to innovation: it takes as its focus the individual who is experiencing difficulty in achieving a goal, and creates a solution that enables them to achieve it.   It functions at a much deeper level than the paint on a machine, reaching the way in which the machine itself is designed.

When design thinking is applied to an organization, it considers the value-generating procedures rather than the operations of a physical object.  In this sense, an organization is a largely organic machine, as the components that generate the greatest value are the human resources rather than facilities and equipment.   Considered thus, commercial organizations lend themselves very well to design thinking: to recognizing the manner in which their current configuration is ill-suited to delivering solutions to customers' problems.

And in the same way that a designer can apply his creative processes to creating an object that enables a user to obtain value, Martin's book describes a plausible methodology by which the same sort of creative process can structure an organization that enables a customer to gain value.

So in all, there's something to this, though it takes a few hundred pages to connect the dots: design thinking is far more than a label to be placed upon bad ideas - it is a practice that leads to genuine innovation.

Thursday, December 11, 2014

Ideas from Nouvelle Cuisine

To avoid the commoditization of customer experience, I periodically make a point of reading on topics from other fields that are relevant but not focused exclusively on the subject.   This is a good way to take a broader perspective, and escape the feedback loop of people doing the same work passing the same ideas around until they become shopworn and stale.

This week, I did some reading in a primer on the topic of nouvelle cuisine, the “modern style” of cooking – figuring that a chef attempting to please a diner likely has some parallels to a customer service professional attempting to please a prospect or customer.   And, indeed, I found a number of tidbits worth bringing back to my own field:

Reject excessively elaborate preparation and presentation methods, and shun the use of rich and heavy sauces that overpower the taste of the main ingredient.

In terms of cuisine, the modern style focuses on choosing good ingredients and preparing them skillfully - not by doing things that are labor-intensive or unusual, but by getting the basics right.  A proper entree needs only a few basic seasonings and careful preparation – and when it hits the plate it should still look and taste like exactly what it is.

Relating this to customer experience: there are a lot of bizarre ideas that are tried in order to put a “fresh” spin on a product or a transaction, which are done simply because they are unusual and are expected to amaze and delight the customer.   None of this buffoonery substitutes for getting the basics right, and an unobtrusive but competent service experience is always appreciated.

Reduce cooking time to serve food that is prepared on demand and served fresh.

In terms of cuisine, this principle focuses on the quality of the meal served to each individual diner, rather than the quantity of meals served to all diners.  Cafeterias and similar venues practice “utility cooking” that is meant to save time or effort and serve a mass of people poorly-prepared food, which is convenient for the kitchen but repulsive to the patrons it serves.

Relating this to customer experience: beware of making the efficiency of a business operation your primary concern.   Cutting costs to boost profits will always be an attractive option for the accountants who are concerned with expenses, but if it compromises the experience of the customer, they will be less willing to provide revenue from which those expenses are paid.

A second principle for customer experience is customized and just-in-time delivery.  In the present day customers are less prone to humbly accept a product that has been mass-produced and more likely to demand a product that has been tailored to their needs, and they are willing to pay more because customized products are more valuable and relevant to them.

Serve a smaller menu to a more limited clientele.

In terms of cuisine, nouvelle was not created as the cuisine for everyone – the majority of people clung to traditional ways and were perfectly happy eating slops from a community cauldron.   The nouvelle tradition did not lower itself to provide what the masses wanted cheaply, but catered to the wealthy and, over time, gained popularity with the masses without compromising its standards.

“Serving a smaller menu” translates into becoming more specialized and limited in your product lines as well as making products that are specifically designed to meet a small number of needs (perhaps to perform even a single task) in a competent and effective manner rather than offering so many products or so many features that you lose focus and expertise in doing any of them particularly well.

Relating to customer experience: beware of trying to be everything to everyone, as the low quality of experience that characterizes most industries is the result of attempting to do exactly that: to identify and conform to the lowest common denominator in order to attract as many customers as possible.   It can regularly be seen that new products begin with a small following and spread slowly over time – and it is necessary to be patient, rather than attempt to have broad appeal immediately.

Consider the dietary needs of your guests by avoiding red meats, frying, and the use of excessive amounts of salts and sugars.

In the culinary world, nouvelle cuisine addressed a new problem in the dietary habits of the modern age: people no longer suffered from a lack of food, but from the wrong kinds of foods.  And while red meats, fats, and salt were agreeable to the palate, they were harmful to the health of the diner – and the diner's health is ultimately in the long-term interest of the chef who serves him.

In terms of customer experience: beware of things that customers think are attractive, but which are ultimately contrary to their interests.   This requires sellers to pass up the “low-hanging fruit” of things that customers think are good, and to provide a product that serves their actual needs well – recognizing that things that are superficially attractive will secure a quick sale, but things that are actually valuable will secure greater long-term engagements and repeated sales.

Do not shun new techniques and devices, but leverage them for their value.

In the culinary world, cooks of the time were insistent upon using outdated techniques and took the attitude that anything that made a task faster or easier would result in an inferior product.   But the nouvelle school dismissed this irrational prejudice and maintained that new tools and methods were not to be automatically shunned – but embraced so long as quality was not compromised.   The customer will never know, and does not care, if his meal was heated in a convection oven or a regular one – and so long as taste and texture are preserved, there is no reason to shun this new device.

In terms of customer experience: beware of clinging to processes  and technologies simply because there is an insistence that “it’s the way we’ve always done it” but at the same time do not be too quick to adopt any new process or technology if it is detrimental to the quality of experience.  The customer does not know, nor should they care, what software package you use to pay your vendors, and so long as the package delivery service you use gets the item to their door as promised, they should not care about the partner you choose.

Innovation, creativity, and distinctiveness are to be valued over adherence to tradition.

Particularly when you are preparing a well-known and traditional cuisine, diners have a rigid and clearly defined set of standards by which they will assess what you serve, and expect it to conform exactly to tradition.   But nouvelle cuisine is about breaking from tradition and offering clients something they may never have experienced before, and may find pleasurable and even preferable to traditional standards.

In terms of customer experience: beware of giving people exactly what they expect because they have become accustomed to receiving it, particularly when you are adhering to traditions that have nothing to do with the value that the product delivers.   Just because people are used to waiting in line at a cash register doesn’t mean that they value this element of the experience, and it likely should not be preserved simply because it is expected.

Not everything old is bad, and not everything new is good.

I’ve heard seemingly well-informed people pontificate that nouvelle cuisine is a product of the twentieth century, when rapid transportation, refrigeration, and motorized kitchen appliances enabled preparations that had not been at all possible before these technologies existed.    Unfortunately, that perspective is dead wrong.

“Nouvelle” cuisine made its debut in the 1730s, nearly three hundred years ago, and many of the principles of nouvelle are as valid and valued by today’s diners as they were then.  Certainly, technology has made nouvelle cuisine available and affordable to far more people than it originally was, but the concepts it embraces are quite old and well established, and worth preserving.

Often, what people mistake for nouvelle cuisine is the school of molecular gastronomy, which uses chemical preparations and laboratory equipment to do very bizarre things with food, such that it is not recognized as something edible and delivers an experience more of fear and wonder than culinary delight.   In the hands of a skilled practitioner, molecular gastronomy can be quite good and enjoyable as a novelty, though it may be decades or centuries before it becomes common practice in the home kitchen – but in the hands of a novice, it can be quite disgusting and produce a meal that qualifies as “food” merely by the fact that it is not poisonous.

And at the risk of being tedious, the same is true for customer experience: people still seek to fulfill the same needs with today’s products as they have been seeking to fulfill for as long as the species has existed, and many of the “old” principles of customer service are not to be discarded simply because of the date they were discovered.  Neither should any “new” principle or practice be automatically adopted on the assumption that novelty has greater value.

Friday, December 5, 2014

Engagement for Engagement's Sake

During my sophomore year in college, I recall an epidemic of engagements on campus.  It seemed that at least twice a week, a coed would show up in class flaunting a ring and announcing that her boyfriend of the past several months had proposed marriage.   The odd thing was, I don't recall that many of them were able to answer questions about the date of the marriage.   Most would say "after graduation," but a few would admit that there was no date and that "we're just engaged."

The reason these incidents come to mind is that I sense the same thing going on in online marketing, particularly in the social media.   Brands seem desperate to "engage" with customers, and while they have the vague sense that this will lead to some sort of long-term relationship, they don't seem to have a clue as to when or how that will happen.   This is not unprecedented, but it seems that the lessons of the past are forgotten every time a new channel arises.

Consider the rash of clever commercials from a decade or so ago: Mr. Six of Six Flags, Joe Isuzu, the pets.com sock puppet, MasterCard's "priceless" campaign, the Quiznos commercial with the bizarre Spongemonkeys, the animated California Raisins, and the Taco Bell Chihuahua.   All of these were very clever and memorable advertisements which created a great deal of engagement: people remembered the spots, associated them with the brands, and spoke about them in everyday conversations.

The problem is that, in a commercial sense, all of these campaigns were failures.   From the numbers that I have seen, not one of them produced an increase in sales sufficient to cover the cost of the advertising (the California Raisins campaign did sell quite a lot of licensed merchandise, but did not cause a significant change in the consumption of actual raisins).   Customers were engaged, in large numbers, but this did not translate into actual sales.

At the heart of the problem is a fundamental misunderstanding of the funnel approach to marketing management, which provides a false sense of consistency in a series of actions.   It is assumed that if 10% of the prospects who presently enter a showroom convert into customers, then herding 5,000 people into the showroom will reliably result in 500 sales.  This seems entirely rational, until you recognize that it is based on the assumption that the people herded into the showroom are just as interested in purchasing as those who undertook the effort to visit without being begged or tricked into it - which is not necessarily true, especially if deceptive means were used to coax them through the doors.
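
As a minimal sketch of that arithmetic (in Python, with all numbers hypothetical), the naive projection simply scales the observed conversion rate, while an honest forecast must allow that herded visitors may convert at a very different rate:

    # Naive funnel projection vs. an intent-adjusted one (hypothetical numbers).
    # Assumption: visitors coaxed in by a broad campaign convert at a much
    # lower rate than those who undertook the effort to visit on their own.

    organic_rate = 0.10   # observed conversion of self-motivated visitors
    herded_rate = 0.01    # assumed conversion of visitors begged or tricked in

    visitors = 5000

    naive_forecast = visitors * organic_rate    # 500 sales "on paper"
    honest_forecast = visitors * herded_rate    # 50 sales if intent differs

    print(f"Naive forecast:  {naive_forecast:.0f} sales")
    print(f"Honest forecast: {honest_forecast:.0f} sales")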

The result of efforts to herd people into a shop, or to make them aware of a brand through clever advertising, is not merely engagement - it is irrelevant engagement.   People who do not need a product are made aware of a brand, and take interest in it for some reason other than the potential value of the product that is relevant to them.   Their non-buying behavior is claimed to be "engagement" but they remain non-buyers.

It's also speculated that when a brand runs a campaign that is "all style and no substance," it not only draws bad leads, but poisons the perception of potentially good leads.  That is, the ads create a negative impression of the brand such that people who might have been interested in purchasing come to regard the brand as no good because of the use of sensational or deceptive tactics.   There is little more than anecdotal evidence of this, but it does seem entirely plausible.

As such, strategies for engagement in the digital and social media should be regarded (and undertaken) with great caution.   They are likely to call a lot of attention to the brand, but not the attention of the right kind of people - and sometimes not even the right kind of attention.   Unless you are specific about whom you're attempting to engage and have a solid strategy in place to escort them through the rest of the funnel, engagement for the sake of being engaged is not enough.

And like the young coeds in the opening example, the best you'll get out of the deal is nothing, though you may in fact be distracted from more promising prospects while you attempt, hopelessly, to close the deal with those who are interested in a superficial and short-term goal and have no intention of a long-term relationship with your brand.

Monday, December 1, 2014

The Danger of the First Idea

The decision-making process involves three separate steps: (1) developing a number of options, (2) evaluating each option for its effectiveness and efficiency, and (3) comparing the options to determine a course of action.   But in reality, this seldom occurs.  Instead, the first plausible idea is seized upon and the process of implementation follows shortly thereafter.   This seems more efficient - or is often claimed to be by those who would rather not admit that they don't want to put the required effort into the decision-making task - but there is a very good reason that the process insists upon three steps.
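
As a toy illustration (a sketch only - the options and the scoring function here are invented), the difference between the shortcut and the full three-step process might look like this in Python:

    # Seizing the first plausible idea vs. the three-step decision process.

    def first_plausible(options, is_plausible):
        """The shortcut: commit to the first idea that clears a minimal bar."""
        return next(opt for opt in options if is_plausible(opt))

    def deliberate_choice(options, score):
        """Steps 2 and 3: evaluate every option, then compare to choose."""
        evaluated = [(opt, score(opt)) for opt in options]    # step 2: evaluate each
        return max(evaluated, key=lambda pair: pair[1])[0]    # step 3: compare all

    options = ["patch the symptom", "redesign the process", "retire the product"]
    print(first_plausible(options, lambda o: True))       # always the first idea
    print(deliberate_choice(options, lambda o: len(o)))   # the best-scoring idea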

The reason has to do with the psychological phenomenon of selective attention: when a person takes interest in something, everything else is ignored or dismissed.   This prevents distraction and helps to focus on an idea that has been selected to receive attention, by a process of disregarding anything but the one item on which a person (or group) is intent on focusing to ensure that it is seen through to completion.

While this is highly functional after a sound decision has been made, it can also be detrimental to the decision-making process.  Instead of considering all the various options that might achieve a desired outcome, the decision-maker focuses on the first plausible idea and fails to give adequate consideration to others, which may be more efficient or effective at accomplishing the goal.  In so doing he sets to work quickly on something that may not have the best probability of success, that may be less efficient in its use of resources, may cause collateral damage and unintended side-effects, and may achieve only a short-term success.

In some instances, this is done unconsciously - but there are other instances in which it is done with deliberate intent: people become infatuated with the first idea that comes to mind and wish to implement it without giving sufficient consideration to alternatives, and in so doing deliberately refuse to consider the other options that may be available.   Or, if they do consider them, it is only superficially: once the mind has latched onto a single plausible idea, its assessment of other options seems intent not on considering their merits, but on identifying flaws so that they may be quickly discarded in favor of the original plausible idea.

It is particularly interesting to recognize how curtly additional ideas are dismissed, as the same level of scrutiny was not applied to the first idea - as if the initial idea is self-evident and certain of success while any other suggestion must not merely seem better, but prove itself invulnerable to any imaginable contingency.  And still more curious is the inventiveness of the human mind when it comes to dismissing alternatives - a person who has fallen in love with an uncreative idea can be very creative when it comes to inventing reasons to reject any alternative proposal.

Logically, the first idea should be subject to the same level of analysis and scrutiny as any other idea - but this becomes difficult to do when those three steps are mashed into one: it is no longer a fair contest between two alternative courses of action, but a champion-challenger comparison in which the first idea, by simple virtue of having been thought of moments earlier, is presumed to be flawless whereas any challenger is presumed to be flawed.

Whether it is a genuine desire to be efficient, or merely a product of intellectual laziness, giving undue preference to the first idea is dangerous and this tendency should be recognized and mitigated.

Friday, November 28, 2014

Interest and Engagement

The reason that interest and engagement are often blurred into one another is that they do, in fact, have a number of factors in common.

First, the willingness of an individual to give interest or maintain engagement in a task is largely a comparison of effort to benefit.  It is as simple as a causal relationship: if I do this task, then I shall gain its benefit.  This is followed by the assessment of whether the benefits of the task in question are worth the effort that remains to be expended in order to complete it.  A person can be immediately uninterested in starting a task if the equation seems unfavorable, or may disengage even after having started if they reach the same conclusion.

The benefits and efforts of a task are often difficult to gauge.  In the business world, each benefit and effort is monetized to reduce the equation into the simplistic mathematics of money and a "return on investment" is calculated for an array of options, to be considered or compared.   This practice is flawed in that it fails to consider anything that cannot be monetized - from the perspective of a business that cares only for the sums of money it will spend and receive in the short term, this gap is unimportant - but from the perspective of an individual (employee, customer, or other stakeholder), money is often a factor of minor importance.

In our private lives, we make the same assessment in a more qualitative way: we do not always seek to make money for doing something, but to gain pleasure, emotional comfort, intellectual stimulation, and other functional or psychological benefits that do not translate into dollars and cents - nor do the "costs" of undertaking a task monetize so easily: how much is half an hour of "spare" time worth, or what is the money value of doing something a bit boring or unpleasant for that amount of time?

Coupled with this effort-benefit analysis is the level of risk (the likelihood that the effort will indeed produce the benefit expected) and the opportunity cost (other things that could be done with the same amount of time and effort that would achieve a greater benefit).   All three of these judgments contribute to becoming interested in a task that has not begun as well as remaining engaged in a task that has already commenced.
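
For illustration only, here is a sketch in Python of how those three judgments might combine into a single figure - assuming, as the preceding paragraphs caution is rarely true, that everything can be reduced to comparable units:

    # Effort-benefit analysis with risk and opportunity cost
    # (all values hypothetical).

    def expected_net_benefit(benefit, remaining_effort, p_success, best_alternative):
        """Benefit discounted by the risk it never materializes, minus the
        effort still to be expended, minus the opportunity cost of the best
        alternative use of that same time and effort."""
        return benefit * p_success - remaining_effort - best_alternative

    # Become interested (or remain engaged) only while the figure is positive.
    if expected_net_benefit(benefit=100, remaining_effort=40,
                            p_success=0.7, best_alternative=25) > 0:
        print("worth starting (or continuing)")
    else:
        print("pass (or disengage)")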

While interest and engagement are similar in that an individual assesses whether achieving a benefit is worth the effort, the manner in which effort and benefit are considered differs greatly between the time before a task has begun and when it is in progress - and failure to recognize those differences leads to inefficiency, ineffectiveness, and even counterproductivity in designing the experience.

Friday, November 21, 2014

Theory of Moral Sentiments

My study of motivation has taken me even further afield, into the realm of philosophy – specifically looking to Smith’s Theory of Moral Sentiments – only to find that this is not at all a different field, but a different perspective on the same one, in a broader and more general context.  The behavior of consumers is merely a subset of the behavior of human beings, and in that regard Smith has a great deal to offer that, while not focused exclusively on customer behavior, is nonetheless applicable.

Foundational to Smith’s system of morality is the notion that actions are undertaken to achieve an outcome – and that the outcome is (or ought to be) the improvement of one’s own condition.   Even when a person acts in the interests of others,  it is always essentially self-directed in that we seek for ourselves some benefit that is derived from benefitting other people.

This notion translates well to customer behavior and the profession of customer experience: the customer purchases a good or service as a means to accomplish an end – it is in some way of benefit to himself (or his household, or some group of people he means to benefit) – and the clearer the connection between the desired benefit and the means to achieve it, the stronger the motivation.

The customer experience professional is meanwhile motivated by self-interest (increasing revenue to the firm he serves, and which in turn employs him), and enabling the customer to obtain the benefit of the products our employers provide is merely a means to that same self-directed end.

This is important to keep in mind, because some professionals get lost in the world of the customer – which can be a means to greater effectiveness (because success often requires understanding and advocating for the interests of that world) provided that the primary self-directed motive is not altogether forgotten.    It cannot be denied, and should be explicitly embraced, that our ultimate motive is to sell product – in greater quantity and over a greater period of time - and delivering an exceptional customer experience is merely a means to that end.

This dovetails nicely with Smith’s theory of morality – in that unethical behavior is very often predicated not on a desire to do evil, but on a desire to do good without the long-term perspective.  To be moral sometimes means accepting less benefit now to gain more later, or even taking on obligations that we find unpleasant in the short term to gain a greater long-term reward.

My sense is that companies are often hindered by the very same short-sightedness – that in seeking to make the most profit right away, they lead themselves to ineffective and sometimes unethical actions, when the consideration of their mission over a broader scope and a greater length of time would keep them on the “right” path, both morally and functionally.

But that may be another matter, and likely has been and will long continue to be a theme of my meditations in this area and others.

Monday, November 17, 2014

What is Creative Thinking?

For many jobs, and in many instances in life, creative thinking is considered to be a critical skill – yet no-one who uses the phrase seems to be able to explain exactly what they mean by “creative thinking.”  The definition is hazy and subjective, and after some consideration I have the distinct sense it will remain so - but what I've stumbled across while studying the topic might lend some clarity.

Mental Modalities

In a very basic sense, the mind works in three modes: mnemonic, perceptive, and creative.   They are generally distinguished by their temporal quality:
  • The mnemonic mode contains thoughts of the past.  We call upon memory to remember experiences and stored knowledge that was gained before.
  • The perceptive mode contains thoughts of the present.  We analyze the sensory data we are receiving in the moment to perceive what exists in the present time.
  • The creative mode projects thoughts into the future.   We leverage our imagination to conceive of what might exist in the future.
In that sense, all people are capable of thinking creatively.  If you can imagine what you will be doing this afternoon or think about what you might be having for lunch, your mind is working creatively.  Only the severely mentally impaired are completely incapable of creative thought.

The Degree of Difference

In common conversation, the phrase “creative thought” does not connote merely the ability to think about a possible future, but a possible future that has a significant degree of difference from the reality of the past or present.

By definition, you are still thinking creatively if your answer to “what will you do tomorrow?” is “the same thing I did yesterday, and the same thing I am doing today.”   But people regard such a statement as boring, mundane, and uncreative.  What will impress them as creative is an answer that suggests something unlike the past and present.   They are not considering whether a thought is creative, only whether it is novel.

And so, the quality that people are attempting to describe when they speak about creative thinking is not merely the projection of the most probable future based upon the recent past, but the imagination of an improbable future that departs significantly from past experience.  Only when you do that will they consider you to be creative.

Too Much of a Good Thing

While people consider it “creative” to envision a future that departs from the experience of a past, the degree of difference tends to fall within a narrow range.   The future you envision must be significantly different, certainly, but it must remain within the realm of plausibility.

A person who expresses wildly implausible ideas that seem to have no basis in known reality and seem impossible to achieve is not considered to be a creative thinker but eccentric and impractical.  Their ideas are removed from the reality of the past and present by too great a degree to be considered possible.

But until a thing is done, who is to say whether it is possible?  The greatest minds throughout history have been regarded with scorn and derision by those who could not accept their ideas as plausible.   Granted, for every great inventor, there are quite a few scatterbrains whose ideas truly are nonsense, but only once a thing has been accomplished is it possible to tell one from the other.

All Things Considered

All things considered, whether a person regards you as a creative thinker has as much to do with them as it does with you.   That is, when a person assesses someone else's creativity, it is generally in comparison to their own.

If the ideas you express seem like nothing new or different to them, then they will consider you uncreative.   If the ideas you express are as different from known reality as their own, or slightly more so, then they will consider you to be a creative thinker.  And if the ideas you express are so far removed from known reality as to seem implausible, you’re eccentric.

Perhaps the only objective standard is of the completely uncreative mind: if your idea of the future is no different from the past, then it may be fair to say that you are uncreative. But otherwise, it is a very subjective assessment: how much creativity is enough and how much creativity is too much?   There can be no firm and objective standard.

So, Am I Creative?

Because other people compare you to themselves, they are not a good reference to determine whether you are a creative thinker.   They can be counted on to reliably tell you if you are more or less creative than themselves, but whether that’s good or bad depends on how mundane, creative, or eccentric they happen to be.

And so, as in many instances, you’re better off setting your own standard according to the success or failure you routinely experience.
  • If you have difficulty coming up with new ideas, and most often fall back on a proven but laborious solution, then chances are you could benefit from developing creative thinking skills.   
  • If you are constantly coming up with ideas that you eventually discover are completely impractical, then perhaps it’s time to dial back a bit on the creativity and rely more on established principles.
  • If you find yourself somewhere in the middle, relying on convention sometimes, discovering new and practical methods at other times, and occasionally chasing rainbows, you’re likely in the sweet spot.
And all of this should be considered in the aggregate.  A person cannot and should not be creative at all times – the established methods of doing things often become “the established methods” simply because they work very well and the mental effort to try something different is misspent.  But sometimes, you’ll discover that a new wheel works better and it’s well worth reinvention.  And of course, there is the occasional starry-eyed blunder, for which you’ll simply have to forgive yourself.

Each person must ultimately decide the level of creativity that is most suitable to their own experience – the level that produces a sufficient amount of success without too much wasted effort.  A person who has achieved that may comfortably consider himself a creative thinker, and disregard what others have to say, though it seems more likely that it will be a constant area of self-improvement and adjustment.

Wednesday, November 12, 2014

Innovating from a Clean Slate

A common approach to improving a product – whether a physical good, a service process, or some combination of the two – is to begin with the product as it currently is, identify problems that create customer dissatisfaction, and solve those problems.   It is an entirely sensible practice, and one that is capable of making many small, incremental improvements to a product – but it utterly annihilates any possibility of making the significant, revolutionary improvements that amaze customers and leave competitors far behind.

The reasons for this are straightforward:
  • The as-is/to-be practice makes the assumption that the “as is” state is generally acceptable and needs only minor adjustments, so nothing revolutionary is ever considered
  • The same process tends to identify many small problems that are easy to address, so people spring to action to make “quick wins” while the bigger and stickier problems are ignored or unaddressed.
  • Where physical products and service procedures are already similar across firms in a given product category, all firms that use as-is/to-be comparisons see the same problems and derive the same solutions, ensuring that products remain commoditized
Revolutionary improvements require revolutionary thinking, which is a different process from examining and making minor adjustments to what is known.  Innovation on this level requires clean-slate thinking: going back to the very basic assumptions about the problems that customers are facing and the ways in which products provide solutions.

As an example, consider the process of learning a foreign language.   The traditional approach to teaching language is incrementally building vocabulary and syntax: the student learns a number of words and follows models for how they can be arranged into sentences.   This model has been used in both academic courses and professional training, and is highly ineffective: it’s hard to learn a foreign language because it is not taught very well.

For many years, there was not much improvement in teaching methods, because education providers focused on improving the parts of their pedagogy without reconsidering the entire system of teaching.   That is, those who sought to improve instruction came up with different sets of vocabulary words, tailoring courses to business discussions or the common problems of international travellers – but learning vocabulary was still a matter of rote memorization of the names of things and actions out of context.  Likewise, it was recognized that classroom education (having to be in a certain place in a certain time) was inconvenient, so lessons were recorded into books, tapes, and videos that could be portable and consumed at the learner’s convenience – but the method of teaching was still the same.

These incremental improvements addressed specific inconveniences of the learning process, but kept the process the same – and the process, itself, remained broken.  As such, learning a foreign language remained a difficult and time-consuming process, and one which most people who had a desire to learn another language chose to avoid entirely, giving up on achieving their goals.

Solving this problem, and revolutionizing language learning, required stepping back from existing practices to ask the question, “how do people best learn to speak a language?” and to consider, simply enough, the manner in which a person who moves to a foreign country learns the language without classroom instruction.

Considered in that manner, it becomes obvious that people do not memorize lists of words in isolation, but learn words that they hear in the context of everyday life.  This goes both for vocabulary (what a given thing is called) as well as syntax (how an action is described in a way that identifies who is performing it and when).   The core problem, which the system ignored, was the manner of learning, not the content of the lessons.

If I’m not mistaken, the first company to solve this was Rosetta Stone, whose courses did not consist of memorizing vocabulary words and conjugations, but instead provided users with the context of a situation in which words were used and modeled the process of remembering language in context, comparing different situations, and assimilating language as part of the communication process.  And it was wildly successful because it solved the real problem.

And again, the real problem had nothing to do with the performance of parts of the as-is process of teaching, but with the nature of the process itself.   So long as education designers focused on improving the parts while ignoring the systemic problem, no significant progress was made.  They had to start over from a blank slate.  And because their competitors remained mired in incrementally improving a broken process, Rosetta Stone constituted an amazing leap forward.

The same is likely true of a great many products, to the extent that “new and improved” has become something of a joke.  Whenever a product bears that label, customers are dubious that the improvement is significant – they often have to search for what is new and different, and are often disappointed by what it is.  A “new and improved” detergent may have a different scent, and “Version 11.1” of a software product adds features that they have no use for anyway.  

Customers have become so jaded to this practice that in order to get their attention, a product has to launch under an entirely different brand to convince them that it is really different – and even then, there is the tendency for this fact to become known and for customers to spread the word that “new brand” is exactly the same as “old brand” (even if there are some minor differences).  Novelty cannot be faked.

Starting over from a clean slate is an exceedingly difficult process because the certainty of current practices and the fear that something different will not work out inexorably bring people back to considering incremental improvements to the as-is process.   “Let’s consider a new way to X” is followed very quickly by “We’ll start by looking at what we do today and considering ways to improve it.”  Because that’s easy, and because that’s safe.

To go a bit further, I will posit that this is the reason nothing good is ever created by a committee.  When people get together in groups, the fear of the unknown becomes a constant refrain in the “innovation” process – anyone with a bold, new vision is corralled back into the herd – and herds are characterized by a desire for safety and skittishness in the face of the unknown.  


But this is a transition to a much different line of thinking.