dw2

9 April 2012

Six weeks without Symbian

Filed under: Accenture, Android, Apple, applications, Psion, Samsung, smartphones, Symbian, UIQ — David Wood @ 10:58 am

It’s only six weeks, but in some ways, it feels like six months. That’s how much time has passed since I’ve used a Symbian phone.

These six weeks separate me from nearly thirteen years of reliance on a long series of different Symbian phones. It was mid-1999 when prototype Ericsson R380 smartphones became stable enough for me to start using as my regular mobile phone. Since then, I’ve been carrying Symbian-powered smartphones with me at all times. That’s thirteen years of close interaction with various Symbian-powered devices from Nokia, Ericsson (subsequently Sony Ericsson), and Samsung – interspersed with shorter periods of using Symbian-powered devices from Panasonic, Siemens, Fujitsu, Sendo, Motorola, and LG.

On occasion over these years, I experimented with devices running other operating systems, but my current Symbian device was never far away, and remained my primary personal communication device. These non-Symbian devices always left me feeling underwhelmed – too much functionality was missing, or was served up in what seemed sub-optimal ways, compared to what I had learned to expect.

But ahead of this year’s Mobile World Congress in Barcelona, held 27th Feb to 1st Mar, I found three reasons to gain a greater degree of first-hand experience with Android:

  1. I would be meeting representatives of various companies who were conducting significant development projects using Android, and I wished to speak from “practical knowledge” rather than simply from “book knowledge”
  2. Some of my colleagues from Accenture had developed apps for Android devices that I wanted to be able to demonstrate with confidence, based on my own regular experience of these apps
  3. One particular Android device – the Samsung Galaxy Note – seemed to me to have the potential to define a disruptive new category of mobile usage, midway between normal smartphones and tablets, with its radically large (5.3″) screen, contained in a device still light enough and small enough to be easily portable in my shirt-top pocket.

I was initially wary about text entry on the Galaxy Note. My previous encounters with Android devices had always left me frustrated when trying to enter data, without the benefits of a QWERTY keyboard (as on my long-favourite Nokia E6 range of devices), or fluid handwriting recognition (as on the Sony Ericsson P800/P900/P910).

But in the course of a single day, three separate people independently recommended that I look at the SwiftKey text entry add-on for Android. SwiftKey takes advantage of both context and personal history to predict what the user is likely to be typing into a given window on the device. See this BBC News interview and video for a good flavour of what SwiftKey provides. I installed it and have been using it non-stop ever since.
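To give a rough flavour of the kind of prediction involved, here is a toy sketch of a next-word predictor trained on the user's own past text. (SwiftKey's actual engine is proprietary and far more sophisticated – blending multiple context lengths with a general language model – so treat this purely as an illustration of the principle.)

```python
from collections import Counter, defaultdict

class BigramPredictor:
    """Toy next-word predictor that learns from a user's own past text."""

    def __init__(self):
        # Maps each word to a tally of the words seen immediately after it
        self.following = defaultdict(Counter)

    def learn(self, text):
        words = text.lower().split()
        for prev, nxt in zip(words, words[1:]):
            self.following[prev][nxt] += 1

    def predict(self, prev_word, n=3):
        # Return up to n words the user most often typed after prev_word
        return [w for w, _ in self.following[prev_word.lower()].most_common(n)]

predictor = BigramPredictor()
predictor.learn("see you at the meeting at the office")
print(predictor.predict("at"))  # → ['the']
```

Even this toy version shows why such predictions improve the more you type: the personal history accumulates, and the suggestions increasingly reflect your own phrasing.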

With each passing day, I continue to enjoy using the Galaxy Note, and to benefit from the wide ecosystem of companies who create applications for Android.

Here’s some of what I really like about the device:

  • The huge screen adds to the pleasure of browsing maps (including “street view”), web pages, and other graphic, video, or textual content
  • Time and again, there are Android apps available that tailor the mobile user experience more closely than web-browsing alone can achieve – see some examples on the adjacent screenshot
  • These apps are easy to find, easy to install, and (in general) easy to use
  • Integration with Google services (Mail, Maps, etc) is impressive
  • I’ve grown to appreciate the notification system, the ubiquitous “back” button, and the easy configurability of the device.

On the other hand, I’m still finding lots of niggles, in comparison with devices I’ve used previously:

  • It’s hard to be sure, but it seems likely to me that I get a working network connection on this device less often than on previous (e.g. Nokia) devices. This means, for example, that when people try to ring me, calls go through to my voicemail more often than before, even though the phone appears (to my eyes) to be working. I find myself rebooting this device more often than previous ones, to re-establish a working network connection
  • I frequently press the “back” button by accident, losing my current context, for example when turning the phone from portrait to landscape; in those moments, I often silently bemoan the lack of a “forward” button
  • The device is not quite capable of one-handed use – that’s probably an inevitable consequence of having such a large screen
  • Although integration with Google services is excellent, integration with Outlook leaves much to be desired – particularly interaction with email notifications of calendar invites. For example, I haven’t found a way of accepting a “this meeting has been cancelled” notification (in a way that removes the entry from my calendar), nor of sending a short note explaining my reason for declining a given meeting invite along with the decline notification, etc
  • I haven’t gone a single day without needing to recharge the device part-way through. This no doubt reflects my heavy use of the device. It may also reflect my continuing use of the standard Android web browser, whereas on Symbian devices I always quickly switched to using the Opera browser, with its much reduced data transfer protocols (and swifter screen refreshes)
  • Downloaded apps don’t always work as expected – perhaps reflecting the diversity of Android devices, something that developers often remark about, as a cause of extra difficulty in their work.

Perhaps what’s most interesting to me is that I keep on enjoying using the device despite all these niggles. I reason to myself that no device is perfect, and that several of the issues I’ve experienced are problems of success rather than problems of failure. And I continue to take pleasure in interacting with the device.

This form factor will surely become more and more significant. Up till now, Android has made little market headway with larger tablets, as reported recently by PC World:

Corporations planning tablet purchases next quarter overwhelmingly voted for Apple’s iPad, a research firm said Tuesday [13th March]

Of the 1,000 business IT buyers surveyed last month by ChangeWave Research who said they would purchase tablets for their firms in the coming quarter, 84% named the iPad as an intended selection.

That number was more than ten times the nearest competitor and was a record for Apple.

However, Samsung’s success with the “phablet” form factor (5 million units sold in less than two months) has the potential to redraw the market landscape again. Just as the iPad has impacted people’s use of laptops (something I see every day in my own household), the Galaxy Note and other phablets have the potential to impact people’s use of iPads – and perhaps lots more besides.

Footnote 1: The Galaxy Note is designed for use with an “S Pen” stylus, as well as by finger. I’ve yet to explore the benefits of this stylus.

Footnote 2: Although I no longer carry a Symbian smartphone with me, I’m still utterly reliant on my Psion Series 5mx PDA, which runs the EPOC Release 5 precursor to Symbian OS. I use it all the time as my primary Agenda, To-do list, and repository of numerous personal word documents and spreadsheets. It also wakens me up every morning.

Footnote 3: If I put on some rose-tinted glasses, I can see the Samsung Galaxy Note as the fulfilment of the design vision behind the original “UIQ” device family reference design (DFRD) from the early days at Symbian. UIQ was initially targeted (1997-1999, when it was still called “Quartz”) at devices having broadly the same size as today’s Galaxy Note. The idea received lots of ridicule – “who’s going to buy a device as big as that?” – so UIQ morphed into “slim UIQ” that instead targeted devices like the Sony Ericsson P800 mentioned above. Like many a great design vision, UIQ can perhaps be described as “years ahead of its time”.

1 April 2012

Why good people are divided by politics and religion

Filed under: books, collaboration, evolution, motivation, passion, politics, psychology, RSA — David Wood @ 10:58 pm

I’ve lost count of the number of people who have thanked me over the years for drawing their attention to the book “The Happiness Hypothesis: Finding Modern Truth in Ancient Wisdom” written by Jonathan Haidt, Professor of Social Psychology at the University of Virginia. That was a book with far-reaching scope and penetrating insight. Many of the ideas and metaphors in it have since become fundamental building blocks for other writers to use – such as the pithy metaphor of the human mind being divided like a rider on an elephant, with the job of the rider (our stream of conscious reasoning) being to serve the elephant (the other 99% of our mental processes).

This weekend, I’ve been reading Haidt’s new book, “The Righteous Mind: Why Good People Are Divided by Politics and Religion”. It’s a great sequel. Like its predecessor, it ranges across more than 2,400 years of thought, highlighting how recent research in social psychology sheds clear light on age-old questions.

Haidt’s analysis has particular relevance for two deeply contentious sets of debates that each threaten to destabilise and divide contemporary civil society:

  • The “new atheism” critique of the relevance and sanctity of religion in modern life
  • The political fissures that are coming to the fore in the 2012 US election year – fissures I see reflected in messages full of contempt and disdain in the Facebook streams of several generally sensible US-based people I know.

There’s so much in this book that it’s hard to summarise it without doing an injustice to huge chunks of fascinating material:

  • the importance of an empirical approach to understanding human morality – an approach based on observation, rather than on a priori rationality
  • moral intuitions come first, strategic reasoning comes second, to justify the intuitions we have already reached
  • there’s more to morality than concerns over harm and fairness; Haidt memorably says that “the righteous mind is like a tongue with six taste receptors”
  • the limitations of basing research findings mainly on ‘WEIRD’ participants (people who are Western, Educated, Industrialised, Rich, and Democratic)
  • the case for how biological “group selection” helped meld humans (as opposed to natural selection just operating at the level of individual humans)
  • a metaphor that “human beings are 90 percent chimp and 10 percent bee”
  • the case that “The most powerful force ever known on this planet is human cooperation — a force for construction and destruction”
  • methods for flicking a “hive switch” inside human brains that open us up to experiences of self-transcendence (including a discussion of rave parties).

The first chapter of the book is available online – as part of a website dedicated to the book. You can also get a good flavour of some of the ideas in the book from two talks Haidt has given at TED: “Religion, evolution, and the ecstasy of self-transcendence” (watch it full screen to get the full benefits of the video effects):

and (from a few years back – note that Haidt has revised some of his thinking since the date of this talk) “The moral roots of liberals and conservatives”:

Interested to find out more? I strongly recommend that you read the book itself. You may also enjoy watching a wide-ranging hour-long interview between Haidt and Robert Wright – author of Nonzero: The Logic of Human Destiny and The Evolution of God.

Footnote: Haidt is talking at London’s Royal Society of Arts at lunchtime on Tuesday 10th April; you can register to be included on the waiting list in case more tickets become available. The same evening, he’ll be speaking at the Royal Institution; happily, the Royal Institution website says that there is still “good availability” for tickets:

Jonathan Haidt, the highly influential psychologist, is here to show us why we all find it so hard to get along. By examining where morality comes from, and why it is the defining characteristic of humans, Haidt will show why we cannot dismiss the views of others as mere stupidity or moral corruption. Our moral roots run much deeper than we realize. We are hardwired not just to be moral, but moralistic and self-righteous. From advertising to politics, morality influences all aspects of behaviour. It is the key to understanding everybody. It explains why some of us are liberals, others conservatives. It is often the difference between war and peace. It is also why we are the only species that will kill for an ideal.

Haidt argues we are always talking past each other because we are appealing to different moralities: it is not just about justice and fairness – for some people authority, sanctity or loyalty are more important. With new evidence from his own empirical research, Haidt will show it is possible to liberate us from the disputes that divide good people. We can either stick to comforting delusions about others, or learn some moral psychology. His hope is that ultimately we can cooperate with those whose morals differ from our own.

Discovering and nourishing an inner ‘Why’

Filed under: books, challenge, Energy, films, leadership, marketing, motivation, passion, psychology — David Wood @ 1:21 am

Where does the power come from, to see the race to its end?

In the 2012 year of London Olympics, the 1981 film “Chariots of Fire” is poised to return to cinemas in the UK, digitally remastered. As reported by BBC News,

The film tells the true story of two runners who compete in the 1924 Paris Olympics despite religious obstacles.

It will be shown at more than 100 cinemas around the country from 13 July as part of the London 2012 Festival.

Starring Ian Charleson and Ben Cross, the film won four Oscars, including best picture, screenplay and music for Vangelis’ acclaimed score.

Although the film is 31 years old, producer Lord Puttnam believes the message is still relevant.  “Chariots of Fire is about guts, determination and belief…” he said.

This is a film about accomplishment against great odds. More than that, it’s a film about motivation that can enable great accomplishment. The film features athletics, but the message applies much more widely – in both business life and personal life.

I vividly remember watching the film on its opening night in Cambridge in 1981, and being so captivated by it that I returned to the cinema the following evening to watch it again. One part that has wedged deep in my mind is the question I’ve placed at the top of this article, which comes from a sermon preached by Eric Liddell, one of the athletes featured in the movie:

Running in a race… is hard. It requires concentration of will. Energy of soul… Where does the power come from, to see the race to its end? From within.

Liddell’s own answer involved his religious faith, including following the principle that forbade playing sport on Sundays. Viewers can take inspiration from the film, without necessarily sharing Liddell’s particular religious views. The general point is this: Lasting personal strength arises from inner conviction.

Anyone watching the film is implicitly challenged: do we have our own inner basis for lasting personal strength? Do we have a ‘Why’ that gives us the power to pick ourselves up and continue to shine, in case we stumble in the course of our own major projects? Indeed, do we have a ‘Why’ that inspires not only ourselves, but others too, so that they wish to work with us or share our journey through life?

In a similar vein, the renowned writer about personal effectiveness, Stephen Covey, urges us (in his celebrated book “The 7 Habits of Highly Effective People”) to Begin with the end in mind and to Put first things first:

Are you–right now–who you want to be, what you dreamed you’d be, doing what you always wanted to do? Be honest. Sometimes people find themselves achieving victories that are empty–successes that have come at the expense of things that were far more valuable to them. If your ladder is not leaning against the right wall, every step you take gets you to the wrong place faster…

To live a more balanced existence, you have to recognize that not doing everything that comes along is okay. There’s no need to over-extend yourself. All it takes is realizing that it’s all right to say no when necessary and then focus on your highest priorities…

I was recently reminded of both Chariots of Fire and Stephen Covey when following up an assignment given to me by a personal coach. The assignment was to view the TED video “How great leaders inspire action” by Simon Sinek:

This talk features high on the page of the TED talks rated by viewers as the most inspiring. Watch the video and this high placement won’t be a surprise to you. I liked the video so much that I downloaded the audio book the talk is based on: “Start with Why: How Great Leaders Inspire Everyone to Take Action”. I’ve been listening to it while walking to/from work over the last few days. It’s been both profound and challenging.

Sinek’s central message is this:

People don’t buy ‘What’ you do, they buy ‘Why’ you do it.

To back up this message, Sinek tells a host of fascinating tales. He offers lots of contrasts, between individuals (or companies) that had a clear, inspiring sense of purpose (their ‘Why’), and those that instead became bogged down in the ‘What’ or the ‘How’ of their work. The former generated loyalty and passion – not so the latter. Examples of the former include Southwest Airlines, Harley Davidson, Starbucks, the Wright Brothers, Martin Luther King, and Apple. He also gives examples of companies that started off with a clear sense of purpose, but then lost it, for example due to changes in leadership, when an operational leader took over the reins from an initial inspirational leader.

Sinek repeatedly contrasts “inspiration” with “manipulation”. Manipulation includes both carrots and sticks. Both inspiration and manipulation can lead to people doing what you want. But only the former can be sustained.

One vivid example covered by Sinek was the leadership of Sir Ernest Shackleton of the 1914-16 Trans-Antarctic Expedition. According to Sinek, Shackleton gathered crew members for this expedition by placing the following advertisement in the London Times:

Men wanted for hazardous journey. Small wages. Bitter cold. Long months of complete darkness. Constant danger. Safe return doubtful. Honour and recognition in case of success. —Ernest Shackleton.

Another of Sinek’s examples is how the Wright Brothers succeeded in achieving the first powered flight, beating a team that was much better funded and seemed better placed to succeed, led by Professor Samuel Pierpont Langley.

In Sinek’s view, it’s not a matter of having energy, or skill, or financing; it’s a matter of something deeper. It might be called ‘charisma’, or ‘cause’:

Charisma has nothing to do with energy; it comes from a clarity of ‘Why’. It comes from absolute conviction in an ideal bigger than oneself. Energy, in contrast, comes from a good night’s sleep or lots of caffeine. Energy can excite. But only charisma can inspire. Charisma commands loyalty. Energy does not.

Energy can always be injected into an organization to motivate people to do things. Bonuses, promotions, other carrots and even a few sticks can get people to work harder, for sure, but the gains are, like all manipulations, short-term. Over time, such tactics cost more money and increase stress for employee and employer alike, and eventually will become the main reason people show up for work every day. That’s not loyalty. That’s the employee version of repeat business. Loyalty among employees is when they turn down more money or benefits to continue working at the same company. Loyalty to a company trumps pay and benefits. And unless you’re an astronaut, it’s not the work we do that inspires us either. It’s the cause we come to work for. We don’t want to come to work to build a wall, we want to come to work to build a cathedral.

There’s a bit too much repetition in the book for my liking, and some of the stories in it can be questioned (for example, the advertisement supposedly placed by Shackleton is probably apocryphal).

But the book (like the TED video) has a tremendous potential to cause people to rethink their own personal ‘Why’. Without clarity on this inner motivation, we’re likely to end up merely going through the motions in activities. We might even seem, from outside, to have many achievements under our belts, but we will (to return to Stephen Covey’s analogy) have climbed a ladder leaning against the wrong wall, and we’ll lack the power to inspire the kind of action we truly want to see.

I’ll finish with a few thoughts on what I perceive as my own ‘Why’ – To enable the widespread radically beneficial application of technology:

Technology, deployed wisely, can do wonders to improve the everyday lives of humans everywhere. But technology also has the potential to do very serious damage to human well-being, via unintended disruptions to the environment and the economy, and by putting fearsome weapons in the hands of malcontents.

As a technology super-convergence accelerates over the next 10-20 years, with multiple hard-to-predict interactions, the potential will intensify, both for tremendously good outcomes, and for tremendously bad outcomes. We can’t be sure, but what’s at risk might be nothing less than the survival of humanity.

However, with the right action, by individuals and communities, we can instead witness the emergence of what could be called “super-humanity” – enabled by significant technological enhancements in fields such as synthetic biology, AI, nanotechnology, and clean energy. Progress in these fields will in turn be significantly impacted by developments in the Internet, cloud computing, wireless communications, and personal mobile devices – developments that will ideally result in strong positive collaboration.

The stakes are sky high. We’re all going to need lots of inner personal strength to steer events away from the looming technology super-crisis, towards the radically beneficial outcome that beckons. That’s a cause worthy of great attention. It’s a race that we can’t afford to lose.

26 March 2012

Short-cuts to sharper thinking?

Filed under: bias, futurist, intelligence, nootropics — David Wood @ 11:15 pm

What are the best methods to get our minds working well? Are there ways to significantly improve our powers of concentration, memory, analysis, and insight?

Some methods for cognitive enhancement are well known:

  • Get plenty of sleep
  • Avoid distracting environments
  • Practice concentration, to build up mental stamina
  • Augment our physical memories with external memories, whether in physical or electronic format, that we can consult again afterwards
  • Beware the sway of emotion – “when your heart’s on fire, smoke gets in your eyes”
  • Learn about cognitive fallacies and biases – and how to avoid them
  • Share our thinking with trusted friends and colleagues, who can provide constructive criticism
  • Listen to music which has the power both to soothe the mind and to stimulate it
  • Practice selected yoga techniques, which can provide a surge of mental energy
  • Get in touch with our “inner why”, that rekindles our motivation and focus.

Then there are lots of ideas about food and drink to partake of, or to avoid. Caffeine provides at least a transient boost to concentration. Alcohol encourages creativity but weakens accurate discernment. Sugar can provide a short-term buzz, though (perhaps) at the cost of longer-term sluggishness. Claims have been made for ginseng, ginkgo biloba, ginger, dark chocolate, Red Bull, and many other foods and supplements.

But potentially the most dramatic effects could result from new compounds – compounds that are being specially engineered in the light of recent findings about the operation of the brain. The phrase “smart drugs” refers to something that could dramatically boost our mental powers.

Think of the character Eddie in the film Limitless, and of the mental superpowers he acquired from NZT, a designer pharmaceutical.

If a real-world version of NZT were offered to you, would you take it?

(Note: NZT has its own real-world website – which is a leftover part of a sophisticated marketing campaign for Limitless.)

I foresee four kinds of answer:

  1. No such drug could be created. This is just fiction.
  2. If such a drug existed, there would be risks of horrible side-effects (as indeed – spoiler alert! – happened in Limitless). It would be foolish to experiment.
  3. If such a drug existed, it would be immoral and/or inappropriate to take it. It’s unfair to short-circuit the effort required to actually make ourselves mentally sharper.
  4. Sure, bring it to me! – especially for mission-critical situations like major exams, job interviews, client bid preparation, project delivery deadlines, and for those social occasions when it’s particularly important to make a good impression.

My own answer: even though nothing as remarkable as NZT exists today, drugs with notable mental effects are going to become increasingly available over the next decade or so. As these drugs become more widely available, their quality and reliability will increase too.

So we’re likely to be hearing more and more of the phrases “cognitive enhancers”, “smart drugs”, and “nootropics”. We’re all going to have to come to terms with weighing up the pros and cons of taking these enhancers. And we’ll probably need to appreciate many variations and special cases.

Yes, there will be risks of side effects.  But it’s the same with other drugs and dietary supplements.  We need to collect and sift evidence, as it is most likely to apply to us.

For example: on the advice of my doctors, I take a small dose of aspirin every evening, and a statin. These drugs are known to have side-effects in some cases. So my GP ensured that I had a blood test after I’d been taking the statin for a while, to check there were no signs of the most prevalent side-effect. In due course, genomic sequencing might identify which people are more susceptible to particular side-effects.

Similarly with nootropics: the best effects are likely to arise from tailoring doses to the special circumstances of individual people, and to monitoring for unusual side effects.

There’s already lots of information online about various nootropics.  For example, see this Nootropics FAQ.  That’s a lot to take in!

Personally, for the next few years, I expect to continue to focus my own cognitive enhancement project on the methods I listed at the start of this article.  But I want to keep myself closely informed about developments in nootropics.  If the evidence of substantive beneficial effect becomes clearer, I’ll be ready to take full advantage.

Hmm, the likelihood is that I’m going to need to become smarter, in order to figure out when it’s wise to try to make myself smarter again by taking one or more nootropics.  But that first-stage mental enhancement can happen by immersing myself in a bunch of other smart people…

That’s one reason I’m looking forward to the London Futurist Meetup on the subject of nootropics that is taking place this Thursday (29th March), from 7pm, in the Lord Wargrave pub at 42 Brendon Street, London W1H 5HE.  It’s going to be a semi-informal discussion, with attendees being encouraged to talk about their own experiences, expectations, hopes, and fears about nootropics.  Hopefully, the outcome will be improved collective wisdom!

25 March 2012

Smartphone technology, super-convergence, and the great inflection of medicine

Filed under: books, Connected Health, converged medicine, healthcare, Internet of Things, medicine — David Wood @ 10:07 pm

“You are positioned to reboot the future of medicine…”

That’s the rallying cry that rings out from Eric Topol’s marvellous recent book “The Creative Destruction of Medicine”. The word “Destruction” is meant in the sense elaborated by the Austrian economist Joseph Schumpeter. To quote from Investopedia:

Creative destruction occurs when something new kills something older. A great example of this is personal computers. The industry, led by Microsoft and Intel, destroyed many mainframe computer companies, but in doing so, entrepreneurs created one of the most important inventions of the century.

Topol believes that a similar transformation is underway in medicine.  His book describes at some length what he calls a “super-convergence” of different technological transformations:

  • Genomics, which increasingly indicates connections between individuals’ DNA sequences and their physiological responses to specific drugs and environmental conditions
  • Numerous small sensors – wearable (within clothing) or embeddable (within the body) – that can continuously gather key physiological data, such as blood glucose level, heart rhythm, and blood pressure, and transmit that data wirelessly
  • Improvements to imaging and scanning, that provide clearer information as to what is happening throughout the body (including the brain)
  • Enormous computing power that can manipulate vast amounts of data and spot patterns in it
  • Near ubiquitous smartphones, which can aggregate data from sensors, host all kinds of applications related to health and wellness, and provide early warnings on the need for closer attention
  • 3D manufacturing and synthetic biology, that can create compounds of growing use in medical investigation and bodily repair
  • The adoption of electronic medical records, that allow healthcare professionals to be much more aware of the medical history of their patients, reducing the number of problems arising from unexpected interactions between different treatments
  • The emergence of next generation social networks binding together patients with shared interest in particular diseases, allowing crowd-sourcing of new insight about medical conditions
  • Enhanced communications facilities, that enable medical professionals to provide advice and even conduct operations from far-distant locations
  • Improved, free medical training facilities, such as the short videos provided by the Khan Academy.
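The smartphone-as-aggregator idea in the list above can be illustrated with a toy sketch: a phone receiving timestamped blood-glucose readings from a wireless sensor and flagging out-of-range values as early warnings. (The thresholds here are illustrative only – real clinical alert thresholds are set per patient – and this is not how any particular health app is implemented.)

```python
def glucose_alerts(readings, low=4.0, high=10.0):
    """Scan timestamped blood-glucose readings (mmol/L) relayed from a
    wireless sensor; return early-warning alerts for out-of-range values."""
    alerts = []
    for timestamp, value in readings:
        if value < low:
            alerts.append((timestamp, "low", value))
        elif value > high:
            alerts.append((timestamp, "high", value))
    return alerts

# A morning's readings from a continuous glucose monitor (hypothetical data)
readings = [("07:00", 5.2), ("09:30", 11.4), ("12:00", 3.6)]
print(glucose_alerts(readings))  # → [('09:30', 'high', 11.4), ('12:00', 'low', 3.6)]
```

The interesting engineering lies in aggregating many such sensor streams and personalising the thresholds – which is exactly where Topol sees the smartphone’s computational power paying off.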

Topol has an impressive track record as a leading medical practitioner, and gives every sign that he knows what he is talking about. Importantly, he maintains a critical, skeptical perspective. He gives plenty of examples of where technology has gone wrong in medicine, as well as where it has done well. His view of the application of accelerating technology to medicine is far from utopian. There are two sorts of problematic factors: technology factors (including the complexity of the underlying science), and non-technology factors.

First, the technology factors.  The ways that individuals react to different medical treatments vary considerably: a drug that saves one life can have terrible side effects in other patients.  What’s more, diseases that were formerly conceived as single entities now appear to be multiple in nature.  However, the move from “population medicine” to “individual medicine”, enabled by advances in genomics and by powerful data analysis, offers a great deal of hope.  For one example of note, see the Wall Street Journal article, “Major Shift in War on Cancer: Drug Studies Focus on Genes of Individual Patients“.  The core principle is that of ever improving digital analysis of data describing individual people – something that Topol calls “digital high definition of humans” leading to “hyperpersonalisation of healthcare… fulfilling the dream of true prevention of diseases”.

But the non-technology factors are just as significant.  Instead of the complexity of the underlying science, this refers to the structure of the medical industry.  Topol has harsh words here, describing the medical establishment as “ultra-conservative”, “ossified”, and “sclerotic” – existing in a “cocoon” which has tended to isolate it from the advances in information technology that have transformed so many other industries.  Topol calls for “an end of the medical priesthood… the end of an era of ‘doctor knows best'”.  Associations of medical professionals who seek to block patients from seeing their own medical data (e.g. a detailed analysis of their personal DNA) are akin, Topol says, to the medieval priests who fought against the introduction of printing and who tried to prevent church congregations from reading the bible in their own hands.

Given such criticisms, it’s perhaps surprising to read the wide range of positive endorsements at the start of the book, from eminent leaders of the medical industry.  This includes:

  • The global president of R&D for Sanofi
  • The professor of genetics from Harvard Business School
  • The chairman and CEO of Medtronic
  • The professor and vice-chair of surgery from NY Presbyterian/Columbia University
  • The chief medical officer from Philips Healthcare
  • The executive vice president and chief of medical affairs from United Health Group
  • The president of the Salk Institute for Biological Studies

and many others.  And for a growing list of reviews of the book, including from many people deeply embedded in the medical industry, see this compendium on the 33 Charts blog.  What’s happening here is that Topol is drawing attention to structural issues inside the medical profession, which many other people recognise too.  This includes risk aversion, long training cycles that place little emphasis on information technology, funding models that emphasise treatment rather than prevention, tests that are unnecessary and dangerous, and lengthy regulatory processes.

If the problem is structural, within the medical industry, the fix lies in the hands of patients.  As per the quote I started with,

“…you are positioned to reboot the future of medicine”

Here’s the longer version of that quote:

With the personal montage of your DNA, your cell phone, your social network – aggregated with your lifelong health information and physiological and anatomic data – you are positioned to reboot the future of medicine.

Topol advocates that patients take advantage of the tremendous computational power that is put into their hands by smartphones, running healthcare applications, connected to wireless sensors, and plumbed into increasingly knowledgeable social networks that have a focus on medical matters – sites such as PatientsLikeMe, CureTogether, and many others.

There’s an important precedent.  This is the way business professionals are taking their own favourite smartphones and/or tablet computers into their workplaces, and are demanding that they can access enterprise systems with these devices.  This trend – “bring your own device” (“BYOD”) – is itself a subset of something known as “the consumerisation of enterprise technology”.  People buy particular smartphones and tablets on account of their compelling ease of use, stunning graphics, accessible multimedia, and rich suite of value-add applications covering all sorts of functionality.  They enjoy using these devices – and expect to be able to use them for work purposes too, instead of what they perceive as clunky and sluggish devices provided via official business channels.  IT departments in businesses all around the globe are having to scramble to respond.  Once upon a time, they would have laid down the law, “the only devices allowed to be used for business are ones we approve and we provide”.  But since the people bringing in their own personal devices are often among the most senior officials in the company, this response is no longer acceptable.

Just as people are bringing their favourite smartphones from their home life into their business life, they should increasingly be willing to bring them into the context of their medical treatment – especially when these devices can be coupled to data sensors, wellness applications, and healthcare social networks.  Just as we use our mobile devices to check our email, or the sports news, we’ll be using these devices to check our latest physiological data and health status.  This behaviour, in turn, will be driven by increasing awareness of what’s available.  And Topol is on a mission to increase that awareness.  Hence his frequent speaking engagements, including his keynote session at the December 2011 mHealth Summit in Washington DC, when I first became aware of him.  (You can find a video of this presentation here.)  And hence his authorship of this book, to boost public understanding of the impending inflection point in medicine.  The more we all understand what’s available and what’s possible, the more we’ll all get involved in this seismic patient-led transformation.

Footnote: Topol’s book is generally easy to read, but contains quite a lot of medical detail in places.  Another book which covers similar ground, in a way that may be more accessible to people whose background is in mobile technology rather than medicine, is “The Decision Tree: How to make better choices and take control of your health”, by executive editor of Wired magazine, Thomas Goetz.  Both Topol and Goetz write well, but Goetz has a particular fluency, and tells lots of fascinating stories.  To give you a flavour of the style, you can read chapter one free online.  Both books emphasise the importance of allowing patients access to their own healthcare data, the emergence of smart online networks that generate new insight about medical issues, and the tremendous potential for smartphone technology to transform healthcare.  I say “Amen” to all that.

14 January 2012

Speaking of angels – visions of a world beyond

Filed under: books, irrationality, magic, paranormal, psychology — David Wood @ 1:03 am

How open-minded are you?

  • Suppose someone you’ve never met before takes a look at the palm of your hand, and shortly afterwards tells you surprising things about yourself – for example, about private issues experienced by your family that no one else knows about.  What would your reaction be?
  • Or consider the case of people apparently leaving their bodies, whilst near death, and travelling around the neighbourhood in an out-of-body experience, observing hidden details that could only be noticed by someone high up in the sky.  Isn’t that thought-provoking?
  • Or what about reliable, trustworthy witnesses who return from spiritualist seances reporting materialisations and apparitions that the best conjurors of the day realise they could not possibly duplicate?
  • What about a president of the United States (Abraham Lincoln) who dreamed the details of his own death, in a precognition, several weeks ahead of that dreadful event?
  • What about someone who can cause the pages of a bible in another part of the room to turn over?  Or pencils to rotate?  Or solid steel spoons to bend and break?
  • Finally, what about a dog which springs to the window, seemingly knowing in advance that its owner has set off from work to return home, and will shortly be arriving at the house?

All these phenomena, and a lot more like them, are described in Professor Richard Wiseman’s recent book, “Paranormality: Why we see what isn’t there“.

At face value, these phenomena testify to the presence of powers far beyond the present understanding of science.  They suggest the existence of some kind of angelic realm, in which information can travel telepathically, from one brain to another, and even backwards in time.

One common reaction to this kind of report is to cough in embarrassment, or make a joke, and move on to another topic.

Another reaction is to become a debunker.  Indeed, Wiseman’s book contains some splendid debunking.  I won’t spoil the fun by sharing these details here, but you can bear in mind the apparently miraculous feats demonstrated right in front of spectators’ eyes by magicians like Derren Brown or “Dynamo“.  (As noted on his website, Wiseman “started his working life as a professional magician, and is a Member of the Inner Magic Circle”.)

However, “Paranormality” goes far beyond debunking.  Although some of the apparently paranormal events do have mundane explanations, for others, the explanation is more wonderful.  These explanations reveal fascinating details about the way the human mind operates – details that have only come to be understood within recent years.

These explanations don’t involve any actual transfer of disembodied thought, or any transcendent angelic realm.  Instead, they shed light on topics such as:

  • Circumstances when the mind can become convinced that it is located outside the body
  • Ways to pick up subliminal cues, by which people “leak” information to one another via subtle movements
  • The sometimes spectacular unreliability of human memory
  • Cognitive dissonance – how people react when, on the surface, prophetic statements have proven false
  • The functioning of dreams, linked to sleep paralysis
  • Circumstances when people feel that there’s a ghostly presence
  • Purposeful movements made by the body, without the awareness of the conscious mind
  • Limitations in the mind’s concept that it has free will.

The book also retells some dramatic historical episodes.  Some of these episodes were already familiar to me, from my days doing postgraduate research in the philosophy of science, when I looked hard and long at the history of research into the paranormal.  Others were, I confess, new to me – including an account of Michael Faraday’s investigation of the mechanics behind table-turning at seances.

The book has many practical tips too:

  • How to develop the habit of “lucid dreams” (when you’re aware that you’re dreaming)
  • How to impress people that you can (apparently) read their mind and discern hidden depths of their character
  • How to distract an audience, so that they fail to notice what’s right in front of them
  • How to organise a group of people around a table, so that the table apparently starts moving of its own volition
  • How to avoid losing control of your mind in circumstances when powerful persuasive influences operate.

In other words, rather than dismissing instances of apparent paranormal occurrences as being inevitably misguided, Wiseman suggests there’s a lot to learn from them.

I expect to hear more of the same theme later today, at the “Centre for Inquiry UK” event “Beyond the Veil – a closer look at spirits, mediums and ghosts“.  This is being held at London’s Conway Hall (one of my own favourite London venues).  Richard Wiseman is one of the speakers there.  The full agenda is as follows:

10.30 Registration (tickets will be available at the door)

11.00: Spirits on the brain: Insights from psychology and neuroscience – Chris French, Professor of Psychology and Head of the Anomalistic Psychology Research Unit at Goldsmiths, University of London

12.00: ‘Is there anybody there?’ – Hayley Stevens, a ghost hunter that doesn’t hunt for ghosts, who has been researching paranormal reports since 2005.

13.00: Lunch break

13.30: Mediums at Large – Paul Zenon, a professional trickster for almost thirty years, during which period he has appeared countless times as performer, presenter and pundit on numerous TV shows

14.00: Paranormality – Richard Wiseman, Professor for the Public Understanding of Psychology at the University of Hertfordshire

15.00: You Are The Magic – Ian Rowland, writer and entertainer with an interest in various aspects of how the mind works or sometimes doesn’t, who taught FBI agents how to be persuasive, and taught Derren Brown how to read fortunes

16.00: End

Postscript: Wiseman’s book contains a number of 2D barcodes.  The book suggests that readers should point their smartphones at these barcodes.  Their smartphones will then be redirected to short related movies on a special website, such as this one.  It was a pleasant surprise to be reminded of the utility of smartphones while my mind was engrossed in reflections on psychology.

1 January 2012

Planning for optimal ‘flow’ in an uncertain world

Filed under: Agile, books, critical chain, flow, lean, predictability — David Wood @ 1:44 pm

In a world with enormous uncertainty, what is the best planning methodology?

I’ve long been sceptical about elaborate planning – hence my enthusiasm for what’s often called ‘agile‘ and ‘lean‘ development processes.  Indeed, I devoted a significant chunk of my book “Symbian for software leaders – principles of successful smartphone development projects” to comparing and contrasting the “plan is king” approach to an agile approach.

But the passage of time brings deeper insight.  Key thinkers in this field now refer to “second generation lean product development”.  Perhaps paramount among these thinkers is the veteran analyst of best practice in new product development, Donald Reinertsen.  I’ve been influenced by his ideas more than once in my career already:

  • In the early 1990s, while I was a software engineering manager at Psion, my boss at the time recommended I read Reinertsen’s “Developing Products in Half the Time“. It was great advice!
  • In the early 2000s, while I was EVP at Symbian, I remember enjoying insights from Reinertsen’s “Managing the Design Factory”.

I was recently pleased to discover Reinertsen has put pen to paper again.  The result is “The Principles of Product Development Flow: Second Generation Lean Product Development“.

The following Amazon.com review of the latest book, by Maurice Hagar, persuaded me to purchase it:

This new standard on lean product and software development challenges orthodox thinking on every side and is required reading. It’s fairly technical and not an easy read but well worth the effort.

For the traditionalist, add to cart if you want to learn:

  • Why prioritizing work “on the basis of project profitability measures like return on investment (ROI)” is a mistake
  • Why we should manage queues instead of timelines
  • Why “trying to estimate the amount of work in queue” is a waste of time
  • Why our focus on efficiency, capacity utilization, and preventing and correcting deviations from the plan “are fundamentally wrong”
  • Why “systematic top-down design of the entire system” is risky
  • Why bottom-up estimating is flawed
  • Why reducing defects may be costing us money
  • Why we should “watch the work product, not the worker”
  • Why rewarding specialization is a bad idea
  • Why centralizing control in project management offices and information systems is dangerous
  • Why a bad decision made rapidly “is far better” than the right decision made late and “one of the biggest mistakes a leader could make is to stifle initiative”
  • Why communicating failures is more important than communicating successes

For the Agilist, add to cart if you want to learn:

  • Why command-and-control is essential to prevent misalignment, local optimization, chaos, even disaster
  • Why traditional conformance to a plan and strong change control and risk management is sometimes preferable to adaptive management
  • Why the economies of scale from centralized, shared resources are sometimes preferable to dedicated teams
  • Why clear roles and boundaries are sometimes preferable to swarming “the way five-year-olds approach soccer”
  • Why predictable behavior is more important than shared values for building trust and teamwork
  • Why even professionals should have synchronized coffee breaks…

Even in the first few pages, I’ve found some cracking good quotes.

Here’s one on economics and “the cost of late changes”:

Our central premise is that we do product development to make money.  This economic goal permits us to use economic thinking and allows us to see many issues with a fresh point of view.  It illuminates the grave problems with the current orthodoxy.

The current orthodoxy does not focus on understanding deeper economic relationships.  Instead, it is, at best, based on observing correlations between pairs of proxy variables.  For example, it observes that late design changes have higher costs than early design changes, and prescribes front-loading problem solving.  This ignores the fact that late changes can also create enormous economic value.  The economic effect of a late change can only be evaluated by considering its complete economic impact.

And on “worship of conformance”:

In addition to deeply misunderstanding variability, today’s product developers have deep-rooted misconceptions on how to react to this variability.  They believe that they should always strive to make actual performance conform to the original plan.  They assume that the benefit of correcting a deviation from the plan will always exceed the cost of doing so.  This places completely unwarranted trust in the original plan, and it blocks companies from exploiting emergent opportunities.  Such behaviour makes no economic sense.

We live in an uncertain world.  We must recognise that our original plan was based on noisy data, viewed from a long time-horizon…  Emergent information completely changes the economics of our original choice.  In such cases, blindly insisting on conformance to the original plan destroys economic value.

To manage product development effectively, we must recognise that valuable new information is constantly arriving throughout the development cycle.  Rather than remaining frozen in time, locked to the original plan, we must learn to make good economic choices using this emerging information.

Conformance to the original plan has become another obstacle blocking our ability to make good economic choices.  Once again, we have a case of a proxy variable, conformance, obscuring the real issue, which is making good economic decisions…

Next, on flow control and the sequencing of tasks:

We are interested in finding economically optimum sequences for tasks.  Current practices use fairly crude approaches to sequencing.

For example, it suggests that if subsystem B depends on subsystem A, it would be better to sequence the design of A first.  This logic optimises efficiency as a proxy variable.  When we consider overall economics, as we do in this book, we often reach different conclusions.  For example, it may be better to develop both A and B simultaneously, despite the risk of inefficient rework, because parallel development can save cycle time.

In this book, our model for flow control will not be manufacturing systems, since these systems primarily deal with predictable and homogeneous flows.  Instead, we will look at lessons that can be learned from telecommunications networks and computer operating systems.  Both of these domains have decades of experience dealing with non-homogeneous and highly variable flows.

Finally, on fast feedback:

Developers rely on feedback to influence subsequent choices.  Or, at least, they should.  Unfortunately, our current orthodoxy views feedback as an element of an undesirable rework loop.  It asserts that we should prevent the need for rework by having engineers design things right the first time.

We will present a radically different view, suggesting that feedback is what permits us to operate our product development process effectively in a very noisy environment.  Feedback allows us to efficiently adapt to unpredictability.

To be clear, Reinertsen’s book doesn’t just point out issues with what he calls “current practice” or “orthodoxy”.  He also points out shortcomings in various first generation lean models, such as Eliyahu Goldratt’s “Critical Chain” methodology (as described in Goldratt’s “Theory of Constraints”), and Kanban.  For example, in discussing the minimisation of Work In Process (WIP) inventory, Reinertsen says the following:

WIP constraints are a powerful way to gain control over cycle time in the presence of variability.  This is particularly important where variability accumulates, such as in product development…

We will discuss two common methods of constraining WIP: the kanban system and Goldratt’s Theory of Constraints.  These methods are relatively static.  We will also examine how telecommunications networks use WIP constraints in a much more dynamic way.  Once again, telecommunications networks are interesting to us as product developers, because they deal successfully with inherently high variability.

Hopefully that’s a good set of tasters for what will follow!
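To make Reinertsen’s point about WIP constraints concrete, here’s a toy single-server simulation of my own devising (this is a sketch to illustrate the idea, not code from the book): demand arrives at random, service times are highly variable, and an optional kanban-style limit defers releasing new work into the process until older work completes.

```python
import random

def simulate(wip_limit, n_jobs=5000, seed=1):
    """Average in-process cycle time (release to done) for a FIFO
    single-server process running at roughly 90% utilisation.
    If wip_limit is set, job i is only released once job i - wip_limit
    has finished - a kanban-style pull constraint."""
    rng = random.Random(seed)
    arrivals, t = [], 0.0
    for _ in range(n_jobs):
        t += rng.expovariate(1.0)            # demand: mean 1 job per time unit
        arrivals.append(t)
    services = [rng.expovariate(1.0 / 0.9) for _ in range(n_jobs)]

    done, release, server_free = [], [], 0.0
    for i, arr in enumerate(arrivals):
        if wip_limit is None or i < wip_limit:
            rel = arr                         # released as soon as it arrives
        else:
            rel = max(arr, done[i - wip_limit])  # wait for older work to drain
        release.append(rel)
        start = max(rel, server_free)
        server_free = start + services[i]
        done.append(server_free)

    return sum(d - r for d, r in zip(done, release)) / n_jobs

print("unconstrained:", simulate(None))
print("WIP capped at 5:", simulate(5))
```

Running it, the WIP-capped run reports a much smaller average in-process time than the unconstrained one; the price is that demand queues upstream in a backlog instead of inside the process – exactly the trade-off Reinertsen analyses with queueing theory.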

30 December 2011

2012 resolution resolution

Filed under: books, psychology — David Wood @ 6:46 pm

It’s the season for new year’s resolutions.  But before composing a new year’s resolution list, some questions:

  • How important is resolve?
  • Should we prioritise self-control?
  • Does willpower matter?

In their recent book “Willpower – rediscovering the greatest human strength“, pioneering psychology researcher Roy F. Baumeister and New York Times science writer John Tierney have a great many positive things to say about willpower and self-control.  Their analysis provides a timely counterbalance in a world that is generally suspicious of thrift and self-denial, and that tends, instead, to value “self-esteem”, “anything goes”, and “if it feels good, do it”.

I consider this to be a very practical book, on a topic that has been overlooked for too long.

Early in the book, the authors provide this summary of recent changed opinions within social science research:

Baumeister and his colleagues around the world have found that improving willpower is the surest way to a better life.

They’ve come to realise that most major problems, personal and societal, centre on failure of self-control: compulsive spending and borrowing, impulsive violence, underachievement in school, procrastination at work, alcohol and drug abuse, unhealthy diet, lack of exercise, chronic anxiety, explosive anger.

Poor self-control correlates with just about every kind of individual trauma: losing friends, being fired, getting divorced, winding up in prison.  It can cost you the US Open, as Serena Williams’s tantrum in 2009 demonstrated; it can destroy your career, as adulterous politicians keep discovering.  It contributed to the epidemic of risky loans and investments that devastated the financial system, and to the shaky prospects for so many people who failed (along with their political leaders) to set aside enough money for their old age…

People feel overwhelmed because there are more temptations than ever.  Your body may have dutifully reported to work on time, but your mind can escape at any instant through the click of a mouse or a phone.  You can put off any job by checking email or Facebook, surfing gossip sites, or playing a video game…  You can do enough damage in a ten-minute online shopping spree to wreck your budget for the rest of the year.  Temptations never cease…

The book contains very interesting reports of how well-known people nurtured stronger willpower – such as the magician and “endurance artist” David Blaine, the 19th century explorer Henry Morton Stanley, personal effectiveness pioneer Benjamin Franklin, and recovering alcoholics such as guitarist Eric Clapton.  It also summarises the results of numerous psychology experiments.  There’s lots of practical advice:

  1. Willpower gets depleted over time; however, supplies of willpower can be replenished by food and rest
  2. Self-control exercised in one region of our life (e.g. to resist eating tempting food) depletes the immediate store of willpower we have for other regions of our life (e.g. not to lose our temper); we don’t have separate supplies of different kinds of willpower
  3. The same observation has a positive side to it too: exercising willpower in some areas of life, and building greater stamina there (over time) – for example, sustained piano practice, or a discipline of meditation or prayer – typically builds better willpower (over time) in other areas too
  4. Temporary reserves of willpower can be reinstated by eating foods that provide a quick release of sugar – though a more sustainable longer term approach is to eat healthily on a regular basis
  5. Willpower can also be augmented when we have better feedback on what we are doing – for example, when we see ourselves in a mirror, or when we record aspects of our health daily (such as our weight), or when a trusted friend or colleague is aware of our goals and discusses our progress with us
  6. Willpower can also be augmented when we see our efforts as fitting into a larger framework or community, which can be seen as a “higher power” – such as a religious, political, or humanitarian cause
  7. The best use of willpower is to design our lives to minimise the impact of potential distractions and temptations.  This includes the above advice on healthy eating, adequate rest, as well as having a less cluttered life.

To elaborate the final point, here’s a summary of some research described in the final chapter of the book:

Researchers were surprised to find that people with strong self-control spent less time resisting desires than other people did…  Self-control is supposedly for resisting desires, so why are the people who have more self-control not using it more often…?

But then an explanation emerged.  These people have less need to use willpower because they’re beset by fewer temptations and inner conflicts.  They’re better at arranging their lives so that they avoid problem situations…

People with good self-control mainly use it not for rescue in emergencies but rather to develop effective habits and routines in school and in work…  They use their self-control not to get through crises but to avoid them.  They give themselves enough time to finish a project; they take the car to the shop before it breaks down; they stay away from all-you-can-eat buffets.  They play offense instead of defence…

The advice on having a less cluttered life applies to the set of goals we set ourselves.  Baumeister and Tierney are not keen on lengthy lists of new year’s resolutions.  Worrying about goal number 4 on the list, for example, is likely to limit our ability to concentrate on goal number 2 on the list:

The first step in self-control is to set a clear goal.  Self-control without goals or other standards would be nothing more than aimless changes, like trying to diet without any idea of which foods are fattening.

For most of us, though, the problem is not a lack of goals but rather too many of them.  We make daily to-do lists that couldn’t be accomplished even if there were no interruptions during the day, which there always are.  By the time the weekend arrives, there are more unfinished tasks than ever, but we keep deferring them and expecting to get through them with miraculous speed.  That’s why, as productivity experts have found, an executive’s daily to-do list for Monday often contains more work than could be done the entire week.

Worse, there are often latent conflicts between different goals.  With too many goals:

  • People worry too much – the more competing demands someone faces, the more time they spend contemplating these demands
  • People get less done – they replace action with rumination
  • People’s health suffers, physically as well as mentally; they pay a high price for too much brooding.

For this reason, even before I get to my own list of new year’s resolutions, I know that the underlying principle is going to be:

  • Do less, in order to make a better job of the things that matter most.

That’s my 2012 “resolution resolution”.

Factors slowing the adoption of tablet computers in hospital

Filed under: Connected Health, mHealth, security, tablets, usability — David Wood @ 12:35 pm

Tablet computers seem particularly well suited to usage by staff inside hospitals.  They’re convenient and ergonomic.  They put huge amounts of relevant information right in the hands of clinicians, as they move around wards.  Their screens allow display of complex medical graphics, which can be manipulated in real time.  Their connectivity means that anything entered into the device can (in contrast to notes made on old-world paper pads) easily be backed up, stored, and subsequently searched.

Here’s one example, taken from an account by Robert McMillan in his fascinating Wired Enterprise article “Apple’s Secret Plan to Steal Your Doctor’s Heart“:

Elliot Fishman, a professor of radiology at Johns Hopkins… is one of a growing number of doctors who look at the iPad as an indispensable assistant to his medical practice. He studies 50 to 100 CT scans per day on his tablet. Recently, he checked up on 20 patients in his Baltimore hospital while he was traveling in Las Vegas. “What this iPad does is really extend my ability to be able to consult remotely anytime, anywhere,” he says. “Anytime I’m not at the hospital, I’m looking at the iPad.”

For some doctors at Johns Hopkins, the iPad can save an hour to an hour and a half per day — time that would otherwise be spent on collecting paper printouts of medical images, or heading to computer workstations to look them up online. Many doctors say that bringing an iPad to the bedside lets them administer a far more intimate and interactive level of care than they’d previously thought possible. Even doctors who are using an iPad for the first time often become attached, Fishman says. “Their biggest fear is what if we took it away.”

However, a thoughtful review by Jenny Gold, writing in Kaiser Health News, points out that there are many factors slowing down the adoption of tablets in hospital:

iPads have been available since April 2010, but less than one percent of hospitals have fully functional tablet systems, according to Jonathan Mack, director of clinical research and development at the West Wireless Health Institute, a San Diego-based nonprofit focused on lowering the cost of health care through new technology…

UC San Diego Health System’s experience with iPads illustrates both the promise and the challenge of using tablet technology at hospitals. Doctors there have been using the iPad since it first came out, but a year and a half later, only 50 to 70 (less than 10 percent of physicians) are using them…

Here’s a list of the factors Gold notes:

  1. The most popular systems for electronic medical records (EMRs) don’t yet make apps that allow doctors to use EMRs on a tablet the way they would on a desktop or laptop. To use a mobile device effectively requires a complete redesign of the way information is presented.  For example, the EMR system used at UC San Diego is restricted to a read-only app for the iPad, meaning it can’t be used for entering all new information.  (To get around the problem, doctors can log on through another program called Citrix. But because the product is built on a Windows platform and meant for a desktop, it can be clunky on an iPad and difficult to navigate.)
  2. Spotty wireless coverage at the hospital means doctors are logged off frequently as they move about the hospital, cutting off their connection to the EMR
  3. The iPad doesn’t fit in the pocket of a standard white lab coat. Clinicians can carry it around in a messenger bag, but it’s not convenient
  4. There are also worries about the relative newness of the technology, and whether adequate vetting has taken place over patient privacy or data security.  For example, as my former Symbian colleague Tony Naggs asks, what happens if tablets are lost or stolen?
  5. Some clinicians complain that tablet computers are difficult to type on, especially if they have “fat fingers”.

Let’s take another look at each of these factors.

1. Mobile access to EMRs

Yes, there are significant issues involved:

  • The vast number of different EMRs in use.  Black Book Rankings regularly provide a comparative evaluation of different EMRs, including a survey released on 3 November 2011 that covered 422 different systems
  • Slower computing performance on tablets, whose power inevitably lags behind that of desktops and laptops
  • A smaller display and the lack of a mouse mean the UI needs to be rethought.

However, as part of an important convergence of skillsets, expert mobile software developers are learning more and more about the requirements of medical systems.  So it’s only a matter of time before mobile access to EMRs improves – including write access as well as read access.

Note this will typically require changes on both the handset and the EMR backend, to support the full needs of mobile access.

2. Intermittent wireless coverage

In parallel with improvements in software, networks are advancing too.  Next-generation WiFi networks can sustain connections more reliably, even in the complex topography of hospitals.

Note that the costs of a possible WiFi network upgrade need to be borne in mind when hospitals are considering rolling out tablet computer solutions.

3. Sizes of devices

Tablets with different screen sizes are bound to become more widely deployed.  Sticking with a small number of screen sizes (for example, just two, as is the case with iOS) has definite advantages from a programmer’s point of view, since fewer screen configurations need to be tested.  But the increasing imperative to supply devices intermediate in size between a smartphone and an iPad means that at least some developers will become smarter at supporting a wider range of screen sizes.
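The standard way developers cope with a range of screen sizes is to express layouts in density-independent units and scale to physical pixels per device.  Here is a minimal sketch of that idea in Python; the 160-dpi baseline follows Android’s “dp” convention, but the helper itself is illustrative rather than any platform’s real API:

```python
# Illustrative sketch: converting a density-independent size into
# physical pixels, so one layout definition can serve many screen
# configurations. The 160-dpi baseline mirrors Android's "dp" unit.

def dp_to_px(dp: float, screen_dpi: float) -> int:
    """Scale a density-independent value (dp) to device pixels."""
    BASELINE_DPI = 160.0  # reference density: 1dp == 1px at 160 dpi
    return round(dp * screen_dpi / BASELINE_DPI)

# The same 48dp touch target maps to different pixel counts per device
# (the dpi figures here are examples, not exact product specifications):
for name, dpi in [("large tablet", 132), ("phone", 326), ("5.3-inch device", 285)]:
    print(name, dp_to_px(48, dpi))
```

Testing against one abstract unit, rather than against each physical resolution, is what makes supporting many screen sizes tractable.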

4. Device security

Enterprise software already has a range of solutions available to manage a suite of mobile devices.  This includes mechanisms such as remote lockdown and remote wipe, in case any device becomes lost or stolen.

With sufficient forethought, these systems can even cover cases where visiting physicians want to bring their own personal devices to work in a particular hospital.  Access to that hospital’s EMR would be gated on the device first installing management software that monitors it for subsequent inappropriate usage.
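The gating logic described above can be sketched very simply: the hospital checks that a device is enrolled in management and currently meets the compliance policy before granting EMR access.  This is a hypothetical sketch, not any real MDM product’s API; all the names and policy checks are invented for illustration:

```python
# Hypothetical sketch of an EMR access gate: a device must be enrolled
# in management and satisfy each compliance requirement (remote wipe
# available, screen lock enforced) before it may reach patient records.

from dataclasses import dataclass

@dataclass
class DeviceStatus:
    enrolled: bool      # has the device installed the management agent?
    remote_wipe: bool   # can the hospital erase it if lost or stolen?
    screen_lock: bool   # is a passcode / lock screen enforced?

def may_access_emr(device: DeviceStatus) -> bool:
    """Grant EMR access only if every compliance requirement holds."""
    return device.enrolled and device.remote_wipe and device.screen_lock

# A visiting physician's personal tablet, before and after enrolment:
byod = DeviceStatus(enrolled=False, remote_wipe=False, screen_lock=True)
print(may_access_emr(byod))   # denied until the agent is installed
byod.enrolled = byod.remote_wipe = True
print(may_access_emr(byod))   # now compliant, so access is granted
```

The point of the design is that compliance is re-evaluated continuously, so a device that later disables its lock screen loses access automatically.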

5. New user interaction modes

Out of all the disincentives to wider usage of tablet computers in hospitals, the usability issue may be the most significant.

Usability paradigms that make sense for devices with dedicated keyboards probably aren’t optimal when part of the screen has to double as a makeshift keyboard.  This can cause the kind of frustration voiced by Dr. Joshua Lee, chief medical information officer at UC San Diego (as reported by Karen Gold):

Dr Lee occasionally carries his iPad in the hospital but says it usually isn’t worth it.  The iPad is difficult to type on, he complains, and his “fat fingers” struggle to navigate the screen. He finds the desktop or laptop computers in the hospital far more convenient. “Are you ever more than four feet away from a computer in the hospital? Nope,” he says. “So how is the tablet useful?”

But that four-foot gap (and it’s probably often larger) can make all the difference to the spontaneity of an interaction.  In any case, there are many drawbacks to using a standard PC interface in a busy clinical setting.  Robert McMillan explains:

Canada’s Ottawa Hospital uses close to 3,000 iPads, and they’re popping up everywhere — in the lab coats of attending physicians, residents, and pharmacists. For hospital CIO Dale Potter, the iPad gave him a way out of a doomed “computer physician order entry” project that was being rolled out hospital-wide when he started working there in 2009.

It sounds complicated, but computerized physician order entry really means something simple: replacing the clipboards at the foot of patient’s beds with a computer, so that doctors can order tests, prescribe drugs, and check medical records using a computer rather than pen and paper. In theory, it’s a great idea, but in practice, many of these projects have failed, in part because of the clunky and impersonal PC interfaces: Who really wants to sit down and start clicking and clacking on a PC, moving a mouse while visiting a patient?

Wise application of user experience design skills is likely to produce some very different interaction styles in such settings in the not-too-distant future.

Aside: if even orangutans find ways to enjoy interacting with iPads, there are surely ways to design UIs that suit busy, clumsy-fingered medical staff.

6. Process transformation

That leads to one further thought.  The biggest gains from tablet computers in hospitals probably won’t come from merely enabling clinicians to follow the same processes as before, only faster and more reliably (important though these improvements are).  More likely, the handy availability of tablets will enable clinicians to devise brand new processes – processes that were previously unthinkable.

As with all process change, there will be cultural mindset issues to address, in addition to ensuring the technology is fit for purpose.  No doubt there will be some initial resistance to new ways of doing things.  But in time, with the benefit of positive change management, good new habits will catch on.

29 December 2011

From hospital care to home care – the promise of Connected Health

Filed under: challenge, Connected Health, converged medicine, healthcare, mHealth, usability — David Wood @ 12:01 pm
  • At least one in four hospital patients would be better off being treated by NHS staff at home

That claim is reported on today’s BBC news website.  The article addresses an issue that is important from several viewpoints: social, financial, and personal:

NHS Confederation: Hospital-based care ‘must change’

The NHS in England must end the “hospital-or-bust” attitude to medical care, says the body representing health service trusts.

At least one in four patients would be better off being treated by NHS staff at home, figures suggest.

2012 will be a key year for the NHS as it tries to make £20bn in efficiency savings by 2015, according to the head of the NHS Confederation, Mike Farrar.

Ministers say modernising the NHS will safeguard its future.

Mr Farrar said: “Hospitals play a vital role but we do rely on them for some services which could be provided elsewhere.

“We should be concentrating on reducing hospital stays where this is right for patients, shifting resources into community services, raising standards of general practice, and promoting early intervention and self-care.

“There is a value-for-money argument for doing this, but it is not just about money and the public need to be told that – this is about building an NHS for the future.”

Mr Farrar said the required changes included treating frail people in their homes, and minimising hospital stays wherever possible.

Politicians and NHS leaders must show the public how these changes could improve care, rather than focusing on fears over the closure of hospital services, he added.

“Many of our hospitals know that the patients that they are treating in their beds on any given day could be treated better – with better outcomes for them and their families – if they were treated outside of hospitals in community or primary care,” he told BBC Radio 4’s Today programme.

Mr Farrar told Today that people had become used to “the hospital being a place of default” and that primary and community healthcare services had sometimes been under-funded.

But he said even where clinicians knew that better care could be provided outside of hospitals, and politicians accepted this privately, the public debate had not helped individuals understand that…

Some of the replies posted online are sceptical:

As a medical doctor based in hospitals, I believe this will not work logistically. Patients are sent to hospitals as they don’t get the specialist care in the community as the skills/services are inadequate/not in place. Patient attitudes must change as many come to a+e against GP advice as they don’t have confidence in community care…

As long as the selfish British public can’t be bothered looking after their own relatives and see hospitals as convenient granny-dumping centres, there is absolutely no way this would work.

There can not be a perfect solution. Not every family can care for a sick person full time, often due to them working. Hospital care may not be a perfect, yet in some cases it does free relatives to be able to work.  Outsourcing care too has a major downside, my wife has done that for years. 15 mins twice a day, can hardly be called acceptable if you apply some form of dignity to the patient.

I saw too many patients I nursed(often elderly or with pre-existing health conditions) kept in hospital too long because no one to care for them at home/wider community. This wasn’t great for them but also blocked an acute bed for someone else. In recent years the pendulum’s swung too far the other way: too many patients discharged without adequate support…

In summary: care in the community would be better in many, many cases, but it’s demanding and challenging:

  • There are social challenges: relatives struggle to put their own lives and careers on hold, to act as caregivers.
  • There are financial challenges: funding for medicine is often preferentially directed to large, centralised hospitals.
  • There are skills challenges: observation of complicated chronic health conditions is more easily carried out in the proximity of specialists.

However, the movement “from hospital care to home care” continues to gather steam – for good reason.  This was a major theme of the mHealth Summit I attended earlier this month in Washington DC.  I was particularly struck by a vision articulated by Rick Cnossen, director of worldwide health information technology at Intel:

In the next 10 years 50% of health care could be provided through the “brickless clinic,” be it the home, community, workplace or even car

As reported in the summary article by Kate Ackerman, “mHealth: Closing the Gap Between Promise and Adoption”:

Cnossen said the technology — such as mobile tools, telehealth, personal health records and social networking — already exists to make this possible. He said, “We have the technology. … It’s time to move out on it.”

Fellow speaker Hamadoun Toure, secretary general of the International Telecommunication Union, took up the same theme:

Mobile phones will increase personal access to health information, mHealth and broadband technology will improve data collection and disease surveillance, patient monitoring will improve and become more prevalent, and remote consulting and diagnosis will be enhanced, thanks to low-cost devices.

“In the near future, more people will access the Internet through mobile devices than through fixed devices,” Toure said. “We are witnessing the fastest change in human history, and I believe (we have) a great opportunity for social development.”

Connected health technology enables better remote monitoring of personal medical data, earlier warnings of potential relapses, remote diagnostics, quicker access to technical information, better compliance with prescription regimes, and much, much more.
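The “earlier warnings of potential relapses” above come down to simple monitoring logic: compare incoming readings against per-patient thresholds and alert a clinician when something drifts out of range.  Here is a minimal sketch of that idea; the vital names and threshold values are invented for illustration, not clinical guidance:

```python
# A minimal sketch of remote-monitoring logic of the kind connected-health
# systems use: flag any reading that falls outside its safe range so a
# clinician can intervene before a relapse.

def check_vitals(readings: dict, limits: dict) -> list:
    """Return alert strings for readings outside their (low, high) range."""
    alerts = []
    for vital, value in readings.items():
        low, high = limits[vital]
        if not (low <= value <= high):
            alerts.append(f"{vital}: {value} outside [{low}, {high}]")
    return alerts

limits = {"heart_rate": (50, 100), "spo2": (94, 100)}
readings = {"heart_rate": 112, "spo2": 97}
for alert in check_vitals(readings, limits):
    print(alert)  # flags the elevated heart rate for follow-up
```

Real systems layer trend analysis and clinician review on top of threshold checks like this, but the principle — continuous data in, early warnings out — is the same.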

But Kate Ackerman raises the question,

So if the technology already exists and leaders from both the public and private sectors see the need, why has progress in mobile health been slow?

It’s an important question.  Intel’s Rick Cnossen gives his answer, as follows:

“The challenge is not a technology problem, it’s a business and a workflow problem.”

Cnossen said, “At the end of the day, mHealth is not about smartphones, gadgets or even apps. It’s about holistically driving transformation,” adding, “mHealth is about distributing care beyond clinics and hospitals and enabling new information-rich relationships between patients, clinicians and caregivers to drive better decisions and behaviors…”

He said health care clinicians can be resistant to change, adding, “We need to introduce technology into the way to do their business, not the other way around.”

Cnossen also said that payment reform is essential for “mHealth to survive and thrive.” He said, “We should not be fighting for reimbursement codes for each health device and app. That is ultimately a losing proposition. Instead, we must fight for payment reform to pay for value over volume, regardless of whether the care was provided in a bricks and mortar facility or was it at the home or virtually through electronic means.”

Personally, I would put the emphasis differently:

The challenge is not just a technology problem, it’s also a business and a workflow problem

Moreover, as the technology keeps on improving, it can often diminish the arguments that are raised against its adoption.  Improvements in quality, reliability, miniaturisation, and performance all make a difference.  Improvements in usability may make the biggest difference of all, as people find the experience of using the new technology increasingly reassuring.

I’ll finish by noting an excerpt from the keynote at the same conference by Kathleen Sebelius, Secretary, U.S. Department of Health and Human Services:

This is an incredible time to be having this conversation. When we talk about mobile health, we are talking about taking the biggest technology breakthrough of our time and using it to take on one of the greatest … challenges of our time. And while we have a way to go, we can already imagine a remarkable future in which control over your health is always within hand’s reach…

This future is not here yet, but it is within sight. And I look forward to working with you to achieve it.
