Friday, November 30, 2012

De-badging, De-branding, Punk, and Existentialism

I saw something interesting in traffic: a de-badged Mercedes S-class.  Normally, when I see a vehicle from which the external labels have been removed, it's a crappy little import, and the assumption is the person driving it is somewhat ashamed of tooling around in one of the cheapest cars in production, what with the automobile-obsessed culture that is America.  But this guy, he got my attention and instant respect for having removed the badging of a luxury brand automobile, normally driven around by shallow egomaniacs who crave respect because of their possessions.  He even replaced the entire grille to get rid of the logo on the front.  That's hardcore.

It struck me as ironic that I would have such a reaction.  A bourgeois suburban meat-puppet with an MBA and a career of pimping brands really ought to hate the guy who has demonstrated contempt for the very thing that I spend the majority of my waking hours working to perpetuate.  But I felt, and still feel, this odd sense of respect even to the level of admiration of someone who appreciated the functional qualities of a luxury brand and yet felt no need to feed off of the brand for self-esteem.  There's a certain quiet strength in shunning the fake respect that comes from having expensive things.   That speaks to my soul.

Or more aptly, it speaks to the soul I once possessed - not the middle-aged corporate drone I am today, but the young man I was in 1980s America during the early punk movement, back when music, fashion, art, and other trappings were external tokens expressing a body of philosophy and not just mindless conformity (it seems the cranky-old-man soul that I am growing into has put in an appearance as well).  Specifically, the core philosophy of punk, underlying the superficial trappings of music and fashion, was anti-image, anti-obsequiousness, anti-brand.

The height of punk fashion involved doing the same thing to expensive clothing that this fellow had done to an expensive automobile: ripping off the tags and emblems to demonstrate contempt for a manufactured culture that promulgated admiration for the ability to consume, rather than the ability to produce.   Of course, it could be argued that leaving the holes where the tags once were became a kind of tag unto itself - the label was gone, but the hole where the label used to be was, itself, a label representing a "brand" of a different kind.  It had meaning and it was there to be seen, conspicuously absent what would otherwise have been conspicuously present.   It's not so different after all, or at best an identical tactic for an opposing message.

It's also the very thing that signaled the end of the punk movement in America.  When the mass market of poseurs with silly-putty for brains adopted the superficial tokens, bought expensive shoes that resembled cheap ones and pre-ripped clothing that was available for sale at considerable markup in the suburban shopping malls, those of us who believed in something more than skin-deep (or cloth-deep) found ourselves lost in a sea of mimics, who were merely conforming to a nonconformist stereotype without any clue of the ideas in the minds beneath the bad haircuts, and shortly afterward the philosophical movement beneath it all died of embarrassment.

All of this is far more autobiographical and self-obsessed than I'd care to be, in this blog-that-isn't-blog.  Or maybe I'm still trying to be conspicuously inconspicuous in my behavior.   Whatever the case, I've wandered a bit off the path and need to make an abrupt turn.  And so ...

In Being and Nothingness, Sartre discusses a couple of concepts that are germane to this rant about brands and consumerism.
  • First, he has the notion of "the look of the other," in which we recognize that other people are perceiving and making judgments about us, shocking us out of the un-self-consciousness that is our normal state as we bumble mindlessly through life, and bringing us to the realization that hell is other people (which sounds flippant, but he was dead serious). 
  • Second, he transitions this to the notion of "bad faith," in which our recognition of the fact that we are being perceived causes us to alter our behavior in response to improve or at least change the perception of ourselves.   The waiter who straightens his tie and acts more "waiterly" because customers are watching.
Whether it is conspicuous consumption or conspicuously inconspicuous consumption, being held in the look of the other causes us to live in bad faith.  As consumers, it leads us to adopt one brand and reject another - or in extreme cases of nihilistic angst, to reject all brands as an attempt to reject the very concept of brand.   But this is nonetheless the same bad faith, one of the cardinal sins of existentialism.

And yet, the point remains that bad faith is likely an inevitable facet of the human condition.  Which brings me back to branding and conspicuous consumerism: while inevitable, bad faith can at least be put to good use by consciously and willfully adapting to the look of the other.  It doesn't change the disreputable nature of living in bad faith, but reclaims some sense of integrity by making it a proactive and purposeful choice rather than an accidental reaction.

And that brings me back to where I began: it's likely not the consequences, or even the motivation, of de-badging that arouses within me this sense of profound admiration - that is to say that branding and de-branding represent different goals to which the same means are applied, with varying degrees of effectiveness.

I may be repeating myself, or perhaps deconstructing myself, so I'll knock it off and go upstairs to cut all the labels out of my suits to demonstrate my contempt for demonstrating things.

Monday, November 26, 2012

Complex Ordering

I went with a group of about sixteen people to a local restaurant for an end-of-project luncheon, and I noticed how tedious the process of ordering a meal has become.  If it were just two or four diners, it might not have been as obvious, but hearing the waiter go through the process with a large group each in turn, taking over twenty minutes just to gather orders (no exaggeration - I was watching the time), the complexity of the ordering process stood out.

Aside from explaining the specials and answering questions multiple times, the process of ordering was a gantlet of questions even when it went quickly.  It took two or three iterations for me to notice this, and I started keeping count of the number of questions asked per person: it was between twelve and eighteen.  Here's how my own order went ...
  1. What would you like to drink?
    Iced tea.
  2. Would you like plain, mango, raspberry, or peach tea?
  3. Sweetened or Unsweetened?
  4. Would you like sugar, Equal, Sweet-n-Low, or Splenda?
    None of the above.  (Though it's a fair question - people will order unsweetened and add sweetener themselves)
  5. What would you like?
    A New York Strip.
  6. The lunch special or the regular menu?
    The lunch special.
  7. How would you like it cooked?
    Medium Rare.
  8. That's going to have a warm red center.  Is that OK?
  9. What are your side items?
    Asparagus and a baked potato.
  10. What toppings do you want on your potato?
    Sour Cream and Chives.
  11. Would you like soup or salad?
  12. Caesar, Wedge, Mixed Green, or Garden?
  13. What kind of dressing would you like?
    Blue Cheese.
  14. Regular or low-fat dressing?
I think this is accurate - I counted fourteen questions at the time, so if I've confabulated, it's a substitution rather than exaggeration.   I've also left out a few of the spots where I added complexity by asking questions (she asked me "what kind of tea?" and I begged her pardon, and I asked the difference between the lunch-special steak and the regular-menu one).  If ordering quickly was my primary objective, I'd have asked for a glass of water (though they can still gum that up by asking tap or bottled, then what brand of bottled) and a side order that had no options to it.
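Out of professional habit, I can't resist sketching the flow as data.  The grouping below is my own reconstruction of the waiter's script from the exchange above - not the restaurant's actual decision tree - but it makes the arithmetic of the twenty-minute ordering session plain:

```python
# A rough model of the lunch-ordering flow. Each list is one branch of
# the waiter's script, reconstructed from the order recounted above.

drink = [
    "What would you like to drink?",
    "Plain, mango, raspberry, or peach tea?",
    "Sweetened or unsweetened?",
    "Sugar, Equal, Sweet-n-Low, or Splenda?",
]

entree = [
    "What would you like?",
    "The lunch special or the regular menu?",
    "How would you like it cooked?",
    "That's going to have a warm red center - is that OK?",
]

sides = [
    "What are your side items?",
    "What toppings do you want on your potato?",
]

salad = [
    "Soup or salad?",
    "Caesar, wedge, mixed green, or garden?",
    "What kind of dressing would you like?",
    "Regular or low-fat dressing?",
]

total = sum(len(branch) for branch in (drink, entree, sides, salad))
print(total)       # 14 questions for a single diner

# Sixteen diners at roughly 14 questions apiece, a few seconds per
# exchange, and twenty minutes stops seeming like a mystery.
print(total * 16)  # 224 question-and-answer exchanges for the table
```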

The point is, this is quite an onerous gantlet of questions for something as simple as ordering a meal.  I've designed applications for insurance products that ask fewer questions.  Granted, I'm thinking of the simpler policies, but even at that, the fact that you'd have to answer more questions to order lunch than to purchase an insurance policy of any kind is totally outrageous.

And yes, I could have streamlined the process by stating "I want unsweetened iced tea, a New York strip cooked medium-rare, asparagus, a baked potato with sour cream and chives, and a garden salad with regular blue cheese dressing."  Even that is quite a long sentence.  But I've tried that before, and it doesn't often work out: the waiter either forgets something I said or stops me mid-sentence ... or worse yet, gets something wrong.  As a customer, the best way to get what I want is to let the waiter go through the process, tedious and time-consuming as it is.

The lunch-ordering inquisition seems very clumsy and onerous from a customer experience perspective, but at the same time inevitable.   The customer is ultimately better served by getting what he wants rather than having a streamlined ordering process that might result in a less-than-satisfactory meal.  

I could likely nitpick some of the questions to effect a minor improvement.  For example, #4 could be eliminated altogether by providing a variety of sweeteners on the table (or, if the reason for not setting them out is to reduce clutter, bring them with any order of coffee or tea); #13 and #14 could be largely removed by having a default dressing on the menu and expecting customers who want something unusual to speak up; and #9 and #10 can be removed by having the chef choose sides, again trusting the customer to speak.

But by and large, complex ordering is a natural consequence of a customizable product - you don't find many restaurants that offer a single menu that will be served to all diners.  When they do, it's the chef's tasting menu or the special of the day (both of which are often used to push items for motives other than customer satisfaction, such as maximizing profit or pushing items that have been over-ordered and are about to expire).

As such, there doesn't seem to be a way to get around it.  Where customers demand a highly personalized product or service, it becomes necessary to add a question (or several) for each element for which they have a choice, resulting in an ordering process that is lengthy and complex.  I'm not keen on that at all, but for now at least I have the sense that it's just something I have to accept ... or more likely, something about which I will take notes and think more later.

Thursday, November 22, 2012

Competitive Advantage and the Product Life Cycle

As a user experience practitioner, I'm gratified and flattered by the suggestion that experience has become a source of competitive advantage; but as a rational individual, I can't accept that this is always so.  It seems to me that the competitive value of experience depends heavily upon the position of a given product in the product life cycle - specifically, that experience is a source of competitive advantage for a mature product in a mature market, but it is not necessarily an advantage at other stages.

Introduction: Availability

When a new product is minted, competitive advantage goes to the firm that makes it available. The choice customers have is to buy from the only firm that supplies it or do without because there is no other source from which it is available.

That may seem simple, but the challenge to getting customers to adopt the product is to convince them that "or do without" is a problem they really need to solve. There are many products that are aggressively promoted on late-night infomercials and home shopping channels that customers can do without.

Insofar as competitive advantage is concerned, there is no pressure on the supplier to do anything more - except the concern that other firms will enter the market and begin offering the same product, which transitions to the next phase.

Growth: Features

As demand for a product grows and more suppliers enter the market to offer it, competitive advantage often comes down to added features: you can get an alarm clock from a number of firms, but only our firm offers a combination clock-radio. When other firms add the radio feature, competitive advantage goes to the firm that offers a snooze alarm.

In essence, the nature of competition is still availability - but it's the availability of a product that has certain features that others lack.  And having those features makes it unique, until a competitor adds them as well.  It becomes a war of ongoing escalation.

Some firms seek competitive advantage for a niche market - by adding features that are geared toward a specific purpose or need - but many simply aggregate features with a desire to accommodate every possible need and maintain as broad a market as possible.

This practice leads to feature-bloat: a product has so many unnecessary bells and whistles that it becomes difficult to use for any specific purpose, and customers get the sense they are paying extra for a lot of features they do not need.

Maturity: Experience

When a product enters the maturity stage, it becomes commoditized.  Customers (rightly) feel that they can get the same product with the same features from any of a number of providers at roughly the same price.

My sense is that this is the stage at which customer experience becomes a significant factor of competitive advantage. Arguably, it might have been an advantage during the growth stage - in that a given firm might see it as a way to make their product more appealing regardless of whether its features are the same as everyone else's - but it's at the maturity stage where experience becomes the only real method of competition for the majority of firms.

That is, given the homogeneity of features and pricing, customers now consider experience as a differentiating factor: they choose based on which supplier offers the easiest or most pleasant shopping/buying experience, or based on which product provides the greatest experiential (specifically, non-functional) benefits.

My sense is experience is touted in the present market because the majority of products have become stagnant. I can't think of an example of a product I own that would be much improved by additional features ... nor can I think of a product that is presently advertised based on unique features. Could be I'm not trying hard enough, but not a single example comes to mind.

Decline: Tradition?

I include "decline" by way of being comprehensive in covering the entire life cycle - but have to admit that I don't have it worked out. My sense is that a product is in decline not because people no longer have the need that it is designed to fulfill (though it seems possible that it might be so), but because an entirely new product has come along that meets the needs of the consumer in a way that the old product cannot.

It seems to me that the primary selling point for a product in decline is the mere resistance against the selling proposition of its replacement. That is, to suggest to buyers that they can do without the new product because the old product still serves their needs, possibly better and/or more cheaply. For lack of a better term, I'll call this "tradition."

My sense is also that an old product can be revived if it can add the features of a new one, or sell customers on the experiential factors of the product. But so long as sales continue to fall as customers adopt newer solutions, the product will remain in decline.


This has been a bit of a ramble, a lot of raw thoughts that need further deliberation and consideration - and I think I may be tackling another issue (feature bloat) from an oblique angle.

Sunday, November 18, 2012

Diseased Companies, Jackass Managers, and Sick Individuals

I was dismissive of Charles Sennewald's book, Traits of a Jackass Manager - it's a very thin book, lots of cartoon drawings and not much text, and most of the chapter titles seemed painfully self-evident. But I found my thoughts going back to it, and eventually picked it up and read through it cover to cover.

The book is intended to address dysfunctional behaviors in management, but as I went through I recognized that some of these behaviors are so widespread as to be part of the very fabric of certain companies, and of entire industries.    The customary tone of the superior-subordinate relationship is an indication of organizational culture - a manager is acting in accordance with the culture of the firm, or if he is not, his behavior is at the very least being tacitly condoned and accepted, though not overtly advocated (as yet).

My sense is the examination of the superior-subordinate relationship yields insight beyond the formal authority structure of an organization, as the same qualities evidence themselves in any relationship in which there is power disparity ... which is to say, in all relationships, as even those that are nominally egalitarian will find that each party has some advantage over the other in certain instances.

And from a perspective of negotiation, one of the most critical assessments in the early stages is to analyze the locus of power: to determine who thinks they have power, who actually does hold power, and to adjust your tactics accordingly.  The traits that people involved in a negotiation demonstrate reflect the degree to which that individual feels himself to be in a position of power.

The qualification "feels himself to be" is significant.  A person who lets on that they are in a position of power can be mistaken.  Such a person might also be well aware that they lack power and are overcompensating with their behavior to deceive the other person into thinking they are in a stronger position than they are.  And in either case, it can be a severe liability - just as a wrestler can use his opponent's force and weight to effect a devastating takedown, so can a skilled negotiator leverage to his own advantage the force of someone who is throwing their weight around carelessly in a metaphorical sense.

At this point, I am chasing a red herring ... I've turned away from relationship management and have begun down the course of negotiation tactics, and have to stagger a bit awkwardly back ...

The tone that a person takes in interacting with others reflects their sense of power in regard to the other party.  This is applicable to manager-subordinate relationships as the author describes - but can be extended to service relationships, in which the provider and the customer each feel themselves to have a source of power, and the negotiation is influenced by this belief.  That is, a boss who feels himself to be in a position of complete power over his subordinates is prone to behave abominably, and so is a firm that feels itself to be in a position of complete power over the market.

It's curious that such misbeliefs are perpetuated to the present day: if it were not so, there would be no market for a book of this nature - nor any need for the volumes of information written to guide firms and individuals in interacting with customers.  It should go without saying that bigotry, despotism, manipulation, autocracy, and the like are detrimental to morale and performance alike - but if it really were self-evident, would these behaviors really be so widespread?

Wednesday, November 14, 2012

Technology Immortalizes Bad Practices

For all the benefits that technology can bring to an organization that adopts it to automate tasks, it has a smothering effect on innovation.  This likely elicits an incredulous "harrumph" from those who are of the opinion that technology alone is innovation - but there's a good case that it isn't, and that it is most often used to ensconce bad practices and prevent companies from being innovative.

Consider first that technology is used to replace processes without changing them in any significant way.  At the outset, a paper-based system of storing and managing information was automated by technology, but the fundamental process by which the information was collected, organized, and accessed was carried over rather than reconsidered in light of the different, and in most cases superior, capabilities of technology.

Has it never struck you as slightly idiotic that personal computers still mimic the artifacts of the ink-and-paper world?   Why is it that a word-processing file is represented by an icon that looks like a typewritten letter?   Why should data be stored in spreadsheets that look suspiciously like paper ledgers and graphical representations of database records look like 3-by-5 index cards or paper forms?   Why should "files" be stored in "folders" and the icon for a file server look suspiciously like a filing cabinet?

Certainly, when the graphical user interface first appeared on personal computers in 1984, it could be argued that representing computer files as if they were paper files was a friendly paradigm for office workers who were reluctant to adopt the new technology.  But nearly thirty years later, we're still locked into the paradigm that content is stored in documents and treated like the paper artifacts that fewer workers in each generation have any memory of ever handling - and that paradigm prevents us from even considering information in any other way.

It's a pernicious problem on a larger scale, when you consider that in order to make and pay for a purchase, you must still fill out a requisition that is approved by one department, which creates a purchase order, which is compared to the requisition before being sent to another department, which places the order with a vendor, who ships the order with an invoice, which must be reconciled against the purchase order in one department and the original requisition form in another.  While technology has digitized the paper and transmits the data through cables instead of pneumatic tubes, it's still fundamentally the same process, with all its redundancies and checkpoints, that has been in existence since the nineteenth century.

This entire nonsensical system exists because the purchasing process itself has remained the same.  Computers replaced paper files with digital ones and transmit data over cables instead of pneumatic tubes, the same way that typewritten orders replaced those written on parchment with ostrich feathers and carried about by human runners.  Never once in all this time has anyone questioned whether the process makes sense, or whether a lot of the busyness of business could be reduced by reconsidering the necessity of the process in light of the capabilities of the technology with which it is executed.

This is the basis of my argument that technology alone isn't innovation: it makes the tasks we do faster and less cumbersome, but it doesn't change the pointless ritual of the process itself.  True innovation would not merely automate an existing way of doing things, but change the way things are done, eliminating busywork, redundancies, and the expense and delay that they cause.  But this didn't happen when technology was introduced - it merely mimicked the old way of doing things.

The second point of my argument is that technology smothers innovation and immortalizes bad practices.  And that's a broader argument - but one that seems sensible to anyone who has changed jobs and spent a significant amount of time being trained, or discovering by trial and error, how "the system" works and the ridiculously complex things they have to do to perform a simple task.  And if you suggest there might be a better or more efficient way, you hear "the system won't let you do that."

It's paralyzing in the course of trying to implement process improvements by modifying existing information systems to eradicate some of the fuss and bother.  Why do we ask the person to enter the same data twice in the course of a single task?  Why do we ask for some information on screen five and other information on screen three when the two are closely related?  Because that's the way the back-end system is designed to work.  And why can't we change it?  Because it would be difficult and expensive to do so.

And that, I believe, is a serious issue for many firms that have lived for many years with information systems that prevent any deviation from processes that have not changed in decades: even if they recognize that there would be greater expediency and efficiency in changing their business processes, they are being held back by archaic paradigms they are forced to accept by adopting technology that mimics traditional and entirely unnecessary processes.

In fairness, this has been a long rant by someone who has of late run into one too many brick walls and smeared lipstick on one too many pigs ... but I can't help the sense that the frustration that I feel at this moment is entirely warranted, and the blithe complacency to accept the limitations of outdated information technology is smothering innovation and progress.

And as much as I love to smirk at the irony that this has all been brought about by the very technology that promised to move us forward, I just can't muster the enthusiasm.  So I rant.  Isn't that what blogs are for?

Saturday, November 10, 2012

Division or Unification of Labor

I heard a fairly convincing two-part argument against Smith's principle of the division of labor.   I use the qualifier of "fairly" because I don't think that it entirely disproves the theory, nor does it replace it, but it does cast some doubt as to whether it is universally applicable.

Division of Labor

As a refresher on the original principle: Smith suggested that dividing a task into a number of simpler tasks, and training workers to perform only a small step in an overall process, enabled considerably greater production with considerably less skilled hands.  It's the theory that was best demonstrated by assembly-line production, and the astounding level of output that could be achieved.

Smith's example was that of a pin factory in which each worker performs a simple task: one draws the wire, another cuts it, another straightens it, another grinds the head, and another sharpens the point.  And by this method, a small group of illiterate farmhands could, with very little training, far outproduce an equal number of master blacksmiths.

Strictly speaking, this was not the origin of the division of labor.  Even before Smith's theory was put into practice, separate tradesmen mined the ore, smelted the metal, and fashioned it into useful objects.   Blacksmiths also specialized according to the kinds of objects they were most skilled in fashioning, and there were separate "smiths" for various kinds of metal.    So in essence, Smith was merely observing an existing phenomenon, but he took it to the extreme.

First Rebuttal: Divisibility

The first rebuttal to Smith's theory draws on an example that seems patently ludicrous - but which seems no less extreme than the example proposed by Smith: that of typing a letter by using fifty or so workers, each of whom uses a machine that has a single key: one person types a lowercase "e", another an uppercase "T", another a comma, another the spacebar, etc.

This follows the very principle of the division of labor, and illustrates the dysfunction of taking it to extremes.  The amount of time to load and position the paper, strike the key, remove the paper, and transport it to the next workstation overwhelms any benefit of specialization, and requires a tremendous amount of coordination to move it from one worker to the next in proper order.

Plainly, the task can be performed more efficiently by a single skilled worker than by a team of unskilled workers, and it makes little sense (and much waste) to subdivide it.  As such, the unification of the tasks involved into one skilled typist makes far more sense than the division of labor among many unskilled hands.

And to that point, the unification of labor is taken further by the elimination of the typist - as it is presently common for the author of a letter to do his own typing, and likely far more efficient, as the raw speed at which keys can be struck matters relatively little.
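The arithmetic of the rebuttal is easy to sketch.  Every number below is invented for illustration - the point is the ratio, not the figures: once each keystroke carries the cost of a handoff, the handoff time, not the typing time, dominates:

```python
# Back-of-the-envelope comparison of a skilled typist against the
# one-key-per-worker scheme. All timing figures are hypothetical.

chars = 1_000    # characters in the letter
strike = 0.2     # seconds to strike a key, either way

# Skilled typist: the job is nothing but keystrokes.
typist_seconds = chars * strike

# Divided labor: every keystroke also requires loading the paper,
# removing it, and carrying it to the next workstation in order.
handoff = 15.0   # seconds of handling per character (conservative)
team_seconds = chars * (strike + handoff)

print(typist_seconds / 60)         # minutes for the lone typist
print(team_seconds / 3600)         # hours for the fifty-worker team
print(team_seconds / typist_seconds)  # the overhead multiplier
```

Whatever figures you plug in, specialization's benefit (a faster keystroke) is bounded, while the coordination cost scales with every unit of work handed off.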

Second Rebuttal: Skilled Labor

A second rebuttal, more philosophical than functional, is that Smith's theory was created at a time when skilled labor was in short supply: it was the perfect solution to the problem of manufacturing when there was not a ready supply of skilled blacksmiths and illiterate laborers had to be trained to perform tasks in a short amount of time (five years of apprenticeship was not feasible).

But in the present day, most workers are at least semi-literate and considerably more intellectually sophisticated than their eighteenth-century counterparts.  So it no longer seems necessary or desirable to break down a task into functions that are so simple that they could be performed by a trained chimp or replaced by a mechanical device that repeatedly performs the same sequence of basic motions.

It's also worth considering that the manufacturing of goods has become a relatively minor part of the present economy.   Most workers in the post-industrial world are involved in what is generally called "knowledge work," in which the intellectual component is of far greater importance in the generation of value than the manual tasks involved in producing physical artifacts.

Many of the most highly-compensated professions produce no physical artifact at all: a doctor, an attorney, and an executive do not (directly) produce any physical object.  And neither is this work adaptable to the division of labor beyond a certain degree.   It can be argued that the work of a physician has been divided into specialized forms of medicine (podiatrists, oncologists, cardiologists, etc.) and that some of the tasks are handed off to labs or clinics (blood work, X-rays, etc.), but there remains the need for a general practitioner to manage the care of a given patient in a holistic manner, calling in specialists when it is beneficial.

In more abstract terms, an intelligent person is capable of performing more sophisticated tasks, and more of them, so there is no practical need to subdivide labor.  In addition to being counterproductive, it is patently unnecessary in a culture in which an educated workforce is readily available.

Reconciling the Extremes

Each of these arguments takes an extreme position - the complete division of a process into minute tasks, or its complete unification onto a single workbench.  As in many things, my sense is that neither extreme is right, but that they represent the ends of a continuum along which an intelligent choice must be made.

When a task is too divided, there arises a need for administration and control, and the number of errors resulting from miscommunication and misdirection rises.  I vaguely recall an article suggesting this can be assessed by the ratio of non-productive employees who supervise and audit work to the number of employees who are directly productive.  Where there is more than one such overseer for every ten productive workers, the overhead cost of administration likely exceeds the efficiencies of the division of labor.  Seems a bit arbitrary, but likely a good indicator that reconsideration is necessary.

When a task is too unified, productivity decreases.  I don't expect the argument of an educated workforce holds much weight here: education would not have been a factor in Smith's pin-factory example, and the most educated and skilled blacksmith would still produce far fewer pins than an assembly line of uneducated workers.  I can recall no guidance on this issue, but it seems to have much to do with task-switching.  Perhaps the same 1-to-10 ratio applies: when a worker's nonproductive time (switching from one task to the next) exceeds one-tenth of his productive time (performing a given task), the gains from specialization likely outweigh its overhead - the time to move work from one person to the next and the costs of coordination and control.
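As I read that rule of thumb, it can be applied in both directions.  A sketch, with hypothetical figures for the staffing and the time lost to switching:

```python
# The 1-to-10 heuristic applied to both failure modes.
# All figures below are hypothetical illustrations.

def overdivided(supervisors: int, producers: int) -> bool:
    """True if overhead staff exceed one per ten productive workers."""
    return supervisors / producers > 1 / 10

def overunified(switching_hours: float, producing_hours: float) -> bool:
    """True if task-switching time exceeds one-tenth of productive time."""
    return switching_hours / producing_hours > 1 / 10

# A line with 40 workers and 6 supervisors/auditors: past the threshold,
# so the administrative overhead likely eats the gains from division.
print(overdivided(6, 40))      # True

# A generalist who loses half an hour of an 8-hour day to switching
# between tasks: still under the 1-to-10 line, so unification holds.
print(overunified(0.5, 7.5))   # False
```

The threshold itself is as arbitrary as the article's, but expressing both failure modes against the same ratio makes the continuum concrete.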

In terms of managing knowledge-workers, the educational factor likely does come into play.   While some individuals claim to be able to perform every task related to a given act of production with expert precision, it is very seldom true.   I've found this to be the case in print and Web production, where individuals arrogantly claim expertise in every facet - only to prove very good at one or two things, marginally competent in a few others, and woefully inadequate at the rest.  There seems to be a finite number of things a person can do very well, and when a task requires expertise in too many dimensions, unification of labor produces a low-quality and low-volume result.

As such, it seems necessary to determine the point between the extremes for any given task or process: division or unification of labor to the logical extreme is not a panacea for all forms of work - but instead, the idiosyncratic nature of the task, and the idiosyncratic capabilities of the individuals who perform it, must be intelligently assessed to arrive at a point where productivity, cost, and quality have achieved an optimum balance.

Tuesday, November 6, 2012

Culture and Service Expectations

I was taken aback by a study I read indicating that customers in the south are much more fastidious, impatient, and demanding in their relationships with service providers than are people in the north ... until I noticed that the article was referring to the north and south of England.

The situation there is much the opposite of the (eastern) United States, where the north is more urbanized and densely populated than the south and people are a lot less gentle with one another - but if you flip the poles, the dynamic is likely the same: people who live in urban environments tend to lead more hectic lives, are far less civil to one another, and likely carry those cultural proclivities over to their relationships with the companies that serve them.

Likely, it would be fair to say "cultural stereotypes" rather than "cultural proclivities," because the north/south dichotomy is a broad generalization. There's a dramatic difference between the personality traits of individuals who live in Manhattan and of those who live "upstate" in a less densely populated area - though both are in the north, the latter are more easygoing, congenial, and almost "southern" in the way they interact with other people.

This got me to thinking that there are a number of factors in the culture of the customer (as opposed to the culture of an organization) that have a dramatic impact on the perception of customer service. In general, the way a person is inclined to interact with other people in society will influence the way he interacts with a business in his role as a customer.

That is, an impatient, rude, obnoxious, and self-centered person is going to be a difficult customer to please.   He will be more critical of flaws, less appreciative of extra efforts, and more surly and difficult in general.  Aside from individual psychology, my sense is that there are objective and measurable environmental factors that lead a person to exhibit this personality.

The factor most germane to the study that spun up this train of thought is location - though it's not as simple as latitude. More likely, it is the population density of the places where a person lives and works that shapes his perspective: people whose daily lives are conducted in densely populated areas are far more irritable in any situation involving human interaction than those who live in areas of lower population density.

Age might also be seen to have an impact, though it's difficult to say whether younger customers would be inclined to be more peevish, self-centered, and unreasonable than older ones. There is the stereotype of elderly people being cantankerous and cranky, and there are no stereotypes without prototypes - and plenty of them - but at the same time, younger customers are peevish and petulant as well.

The more I think on it, the more this seems a matter of sociology than of customer service - though it certainly impacts the latter - and there's likely no shortage of data on the subject from more formal research than an individual's speculation. So I'll stop speculating, but preserve this note for future consideration: much of my ruminating considers the culture of the organization rather than that of the customer, and the latter is likely as important, if not more so.

Friday, November 2, 2012

Media Clutter

My curiosity has been piqued by a new, or at least unusual, method of in-store advertising: floor stickers, particularly in grocery stores. If you've not seen one (as I hadn't until recently), it's a sticker that's about the size of a welcome mat that contains a promotional message for a product, with an accompanying train of footprint-shaped stickers (in the same color as the promotion) leading from the sticker at the end of the aisle to the shelf where the product is located.

What makes this unusual is that the floor of a retail store - the physical surface beneath the shoppers' feet - has until now been a blank canvas. Some marketing enthusiast recognized it as a space he could leverage to grab attention and pimp his wares. Granted, there's some consumer resistance to the proliferation of promotional messaging, as every available inch of space already seems plastered with a clutter of ads - but the novelty makes this technique particularly effective.

I paused to watch half a dozen customers. Everyone seemed to notice the ad, about a third followed the footprints to the shelf and inspected the item being promoted, and none of them actually bought it. A sample of half a dozen people in one location isn't statistically sound, but the attention the sticker commanded does seem to indicate merit.

A floor sticker is also a tricky problem for functional design: it has to stand out against whatever floor it is affixed to. It must stand up to the wear and stains of being walked on all day (and mopped each night), yet not be so slippery that it constitutes a hazard. The adhesive has to be strong enough to hold it to the floor, but not leave a residue when it comes time to remove it, and so on. But this is likely a digression into minor tactical concerns.

What makes this mode of promotion so (presumably) effective is that it is unique. It is the only advertisement on the floor - one message in an uncluttered expanse automatically gets attention. Will it be as effective when there are two? Will it be as effective when there are six of them on each end of the aisle, with a patchwork of interweaving footprint trails from one end of the aisle to another?  I expect not.

And this is true of advertising in all media: it is effective when it is new precisely because nobody else is doing it. If there were but one billboard on a stretch of road, chances are people would notice and pay attention. It's likely a matter of imitative evolution: the first billboard got attention and effectively increased revenue. Someone else saw the idea and copied it, and then there were two. Years later, the roadside is so cluttered with billboards that drivers no longer pay attention, and even complain about it being a distraction and an eyesore, and pundits declare roadside advertising is ineffective.

And this is where I take issue with blanket statements that a given kind of advertising "does not work."   It works, and it works very well, until so many companies attempt the same thing that it becomes clutter. People would not like music concerts if there were six bands playing at the same time, each trying to play louder than the rest - it would be a cacophony, an incomprehensible nuisance. But it would be wrong to conclude that people don't like music and find it annoying - they would like it, and give it their attention, but for the clutter.

But to drag myself back to the specific topic at hand (floor advertising), it will be interesting to see, in ten years, what has become of it. Will the idea catch on at all? Will it become so ubiquitous that customers ignore it? For most forms of advertising, few can remember when they were just getting started and the field was uncluttered - and I look to this as an idea that can be watched from the early stages of its evolution.