Friday, 10 June 2016

IChemE Hazards 26 Conference - some of my highlights.

The IChemE Hazards 26 Conference was held in Edinburgh, 25-26 May 2016.

As ever, there is a lot in the proceedings, and I found this one of the harder conferences I've attended for picking which session to go to, as there were many times when at least two papers I wanted to see were running in parallel!

KEYNOTES:

The keynote speakers included:

Sir Charles Haddon-Cave - Nimrod XV230 Disaster.

He had a number of key lessons (as a lawyer, he is very strong on lists), including:

"Complexity is the enemy of good safety".

The cause of the incident included:

1. Poor design: the incident was caused by fuel coming into contact with a hot surface (above 450°C).  The modification that brought these hot surfaces into proximity with potential fuel contact was made in the 1960s.

2. History of fuel leaks throughout operation - these came to be seen as "normal".

3. Increase in operational tempo - heavy use of planes from Kosovo onwards.

4. Maintenance of an aging aircraft - based on a 1950s design, with its end of life extended and extended.

5. Major organisational change and lack of funding.

6. Outsourcing of the safety case - they had a piece of paper that gave them a "warm feeling", but 40% of the risks ameliorated on paper were still live.

7. Dilution in responsibility, accountability and the associated processes.

He was particularly worried about a culture of "Paper Safety", where PowerPoint slides were used as an alternative to thinking, rather than as a prompt for it.

He encouraged us to keep asking questions, as "Questions are the antidotes to assumptions that so often result in mistakes".

Cheryl Grounds - BP VP Process Safety

Cheryl spoke about her history at Mobil and BP, and illustrated some of the advances and challenges she has seen over her development in safety.  She clearly embodied a drive to make safety a central part of BP, and to have a seat at the top table in BP to make the case for process safety being good business.

Alan Chesterman - Apache

Alan spoke about the approach within Step Change in Safety to managing best practice and improving learning across incidents.  He particularly spoke about the "Joined Up Thinking" initiative within Step Change and the way they are changing the incident databases to allow sharing, rather than inhibit it.

He had a powerful anecdote that his personal commitment to process safety came from having survived a 30 tonne gas release and explosion early in his career, and his hope that others don't have to have quite as personal an incident to remember.

Ken Rivers - Chair COMAH Strategic Forum

A really engaging speaker, he showed a document that was originally to be entitled "Process Safety Management......" but is now titled more like "Managing the NON-FINANCIAL risks that could destroy your company".  He is on a journey to try to make best practice common practice among the forum's members and beyond.

Dr Paul Logan - HSE

He gave a good overview of the position of the regulator, and their strategy of seeking out the places where critical control measures are most vulnerable to failure.

Other Key Papers:

Vent Release, Maersk GPIII.

There were two papers which summarised experiments carried out by HSL to deal with four vent issues on GPIII.  These looked at potential ignition sources, including experiments to determine whether a burning liquid droplet from the flare could ignite a vent, and also examined the areas around a cold vent where ignition could occur.  The work concluded that some ignitions happened even when the LEL predicted by PHAST was as low as 10%.  To account for eddy flow and uncertainty, they recommended a sphere of radius equal to the distance to 50% LEL along the centreline of the release to bound the ignition zone (an approach similar to IP-15 for area classification).

Hindsight Bias

The IChemE Safety Centre have released a set of training material focused on giving people experience of making some of the key decisions that were made in the run-in to major incidents.  The general issue is that people will focus on the "obvious" cause of an incident once it is known, rather than considering whether they would have made the same decisions themselves.  They've put 200+ people through the courses so far, and a large number (she said 90%) made the same decisions that ended up leading to the accident.  Looks like a powerful tool in the training armoury.  See http://www.icheme.org/media_centre/news/2016/tackling-memory-distortion-to-improve-process-safety.aspx.

Riser ESDVs

A paper was presented on the performance of riser ESDVs based on the RIDDOR database.  It drew out a few interesting points:

- Majority of failures were of valves 20-25 years old (although there would be a large population here due to post-Piper improvements).

- Most valves that had failed had failed more than once.

- A number of valves had gone multiple years without being tested before they failed.

- On average the PFD was in the 1-2% range, which is in line with the kind of figures you would get using OREDA-type valve failure rates.
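That 1-2% figure is easy to reproduce from first principles.  As a rough sketch (mine, not the paper's): for a low-demand safety function, the classic approximation is PFD_avg ≈ λT/2, where λ is the dangerous-undetected failure rate and T the proof-test interval.  The failure rate below is an assumed, OREDA-order-of-magnitude number for illustration only.

```python
def pfd_avg(failure_rate_per_hour: float, test_interval_hours: float) -> float:
    """Low-demand average probability of failure on demand: lambda * T / 2."""
    return failure_rate_per_hour * test_interval_hours / 2.0

# Assumed fail-to-close rate of ~2.3e-6 per hour (illustrative,
# OREDA-order-of-magnitude), proof-tested annually:
print(f"PFD_avg ≈ {pfd_avg(2.3e-6, 8760):.1%}")  # ≈ 1.0%
```

A valve with a failure rate in that region, tested once a year, lands right in the 1-2% band the RIDDOR data suggests.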

And finally

Revalidating HEART

HEART (the Human Error Assessment and Reduction Technique) is one of the simpler ways of estimating human error probability for use in fault trees/event trees.  It was originally developed in the 1980s based on general task types.  The HSE have worked with the original author to check the underlying probabilities and error-producing factors, and found that the model still holds true.  A revised handbook for the method is planned for publication this summer.
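For readers who haven't met HEART: it takes a nominal error probability for a generic task type and multiplies it by a factor for each applicable error-producing condition (EPC), weighted by the assessed proportion of effect.  The sketch below is my own illustration of that multiplicative scheme; the numeric values are of the kind found in HEART tables but are quoted for illustration only, not from the paper.

```python
def heart_hep(nominal_hep: float, epcs: list[tuple[float, float]]) -> float:
    """Human error probability per HEART's multiplicative scheme.

    epcs: (max_multiplier, assessed_proportion_of_effect) per condition.
    Each EPC scales the nominal HEP by (max - 1) * proportion + 1.
    """
    hep = nominal_hep
    for max_effect, proportion in epcs:
        hep *= (max_effect - 1.0) * proportion + 1.0
    return min(hep, 1.0)  # probabilities cap at 1

# Illustrative numbers only: a routine task with nominal HEP 0.003,
# with "shortage of time" (max effect x11) judged 40% in force:
print(heart_hep(0.003, [(11.0, 0.4)]))  # ≈ 0.015
```

The appeal of the method is exactly this simplicity: a handful of table look-ups and judgments yields a number you can drop into a fault tree.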

In discussion, it was pointed out that the typical operator is probably not more likely to make a mistake in general, we just give them less warning and harder things to do now with fully automated control and shutdown systems than before.

Friday, 20 November 2015

Do the right job, then do the job right.

Abstract Accepted....

When you put an abstract in for a conference, you are never really sure how it's going to be received.  So when you get an email back from the IChemE saying "your abstract has been accepted", there are two emotions at the same time: the first is that you are delighted that what you thought might be important to talk about has been accepted, the second is the realisation that now you have to turn a 250 word abstract into a 2-3,000 word paper!  The deadline is 16 Jan 2016, and the prize is to speak at the IChemE Hazards 26 conference in Edinburgh, May 2016.

It's not that bad actually....

In this case, it's not actually that bad, as the topic is one close to my heart, and that I and the teams I work with have been dealing with for a number of years.  The full title of what we're talking about this time is "Do the Right Job, then Do the Job Right - Dealing with Process Safety in Appraise and Select."  We've assisted a number of companies with this, from the basics of carrying out small assessments to determine what a job entails and how much it might cost, through to re-engineering company management of change programmes to embed the assessment in the way that things are done.

What I plan to have in the paper....

The paper will address process safety in two ways:

  1. How can you deal with jobs that provide risk improvement, and compare those to jobs which may increase revenue, reduce downtime, reduce environmental or business risk?
  2. How can you make decisions on the process safety aspects of a job at the appraise stage, where there is limited time and the design concept may not be much more than a proposal, rather than a design?
Item 1 will deal with what we refer to as "inappropriate prioritisation".  This includes things like:
  • Prioritisation by who shouts the loudest
  • Prioritisation by deferring to the most senior person in the discussion
  • Prioritisation by creative/spurious use of the word "safety"
  • etc.
It will also talk about the difficulties that arise when it is tempting to keep a set of ranked lists of projects, and fund the top n of each one until the money runs out.  The topic of "ALARP", and how this useful concept in UK-style safety law can be difficult to apply in other regions of the world, is likely to figure.

Item 2 will deal more with the challenges of early hazard identification, and dealing with "Management of Change" as defined by the CCPS, as "not UNKNOWINGLY introducing new hazards or making existing hazards worse".  This will address both design for risk reduction, and assessment of other design for risk aspects.

What won't be in the paper....

We will be drawing on experience of many companies across the globe.  We won't be mentioning any specific installation, and any examples quoted will only be identifiable if you already know the job and installation.

How you might help....

I'd be delighted to get a bit of input from anyone who is interested in this topic.  Any wider perspective, be that from inside or outside the UK, inside or outside of the EU, inside or outside of the oil and gas industry, would be excellent.  Please contact me, preferably via LinkedIn, if you're interested.  I'll happily share my thoughts in development, and return any favours I can.

Thanks in advance....

Monday, 22 June 2015

Like a boxed set, only better

I'm not sure when box sets went from being something you bought as a present on Father's Day to being the way that people consume TV, but it's clearly happened.  If it's not "Game of Thrones", it's a new comedy, or a new murder thriller series from Scandinavia. 

The adverts on the TV all seem to imply that the best way is to pick a boxed set, and then binge-watch until you have caught up.  Which is how I think I am currently treating a great resource recommended to me by Richard Cousins at BP. 

It's called "DisasterCast", and is a podcast series by Drew Rae, (see http://www.griffith.edu.au/humanities-languages/school-humanities/staff/drew-rae).  I've listened to maybe twenty of these so far, and they are uniformly excellent, thought provoking, and informative.  I've mentioned them to a few colleagues, but am clearly ahead of them so far in my listening.  I know I'll be liberally quoting from his thoughts from now on.

The podcasts are available at http://disastercast.co.uk/, and can be downloaded via all major podcast software, including iTunes.

If BBC Radio 4 describes itself as "broadsheet radio", then this is "broadsheet safety broadcasting".  Simply superb.

Wednesday, 11 February 2015

What Ancient Troy has to do with Safety.

I had the joy of studying Latin in school, and a lot of what we studied revolved around the stories of Troy, the foundation of Rome, the various gods, and their dealings with the mortals.  One character who always stood out to me was Cassandra, sister of Paris, the Prince of Troy who steals Helen as his bride and brings about the ten-year war.  Cassandra was given the gift of prophecy by the god Apollo, and in one version of the story she could then foresee the fall of Troy, but fell out with Apollo.  He in turn was not allowed to take back the gift, but instead changed it so that she would still be able to foretell the future, just that nobody would believe her.

"What has this to do with offshore safety?", you might ask.  Well, we are now in a challenging, cost-focused environment (again).  And there will be significant pressure to reduce workload offshore, (again).  And there will be temptation to trim back to the bare minimum again.

The Chair of the HSE, Judith Hackitt, summarised it well in her guest editorial in "The Chemical Engineer", the journal of the Institution of Chemical Engineers of which she was president up to last year.  The full text is available here.

She is pointing out that we've been here before.  We got it wrong before.  And it's entirely foreseeable that we will get it wrong again, if we focus ONLY on the cost, and not on the risk.  And while most people will not actively cause harm, it's entirely foreseeable that poor decisions we take now, or even postpone for now, will cause delayed harm to our colleagues at the work-face.

If safety is a core value for our companies, like the words on the wall might say, we need to demonstrate that as much now as in the better times.  The point of principles is that they should guide how we behave when it is not easy to stick to them.  Let us not be like Cassandra, and see the needless loss of what we and others hold dear.

Tuesday, 29 July 2014

Those who can, do. Those who understand, explain......

Note: this article appeared in an edited format in the July-August Issue of "The Chemical Engineer" magazine, and also in the Atkins internal "Inside Energy" Magazine.

Background.....

Seveso III – Public Information
The European Commission updated the Seveso Directive in line with the Aarhus Convention [1] on public information, public participation in decision-making and access to justice on environmental matters.  This is quite a culture shift from the requirements in the Seveso II Directive.

In line with the Aarhus Convention the changes will mean that:
  • the level and quality of information for the public will need to be improved, particularly for those likely to be affected;
  • active provision of information – not just on request;
  • the information will need to be available electronically and kept up to date.
(Source, HSE Website http://www.hse.gov.uk/seveso/public.htm)

It’s a scene familiar to many of us.  The HAZOP Chair asks “And are we comfortable that the safeguards you mentioned are sufficient against this hazard?  Can you explain that to us?”  And the process engineer is at the centre of attention, expected to have the facts at her fingertips.  But it’s not a major crisis.  The design is well done.  The hazards, although complex, are understood, and the approach being taken may be novel to some, but is the right solution.  So the answer is given, the team gives it consideration, the scribe records the discussion, the process moves on.

Many of us work in complex, high hazard industries.  The application of safety in design, ergonomics, risk assessment, process safety, means that for the vast majority of time, the hazards stay as potential, not actual harm.  We are degree qualified engineers, chartered, even fellows of our institutes.  We play our part, get the job done.  Within our offices, within the plant, within the technical bubble we work in, our jargon, our common understanding, our acronyms and buzzwords communicate to our peers, our operators, and sometimes our management, that we have our risks under control.
Picture the same engineer, but this time at the local town hall.  The new regulations being introduced across the EU in 2015, shaped by our experience and the experiences of our neighbouring countries, mean that the public will soon have access to more information about our high hazard plants, the substances we routinely deal with, the controls that we have in place.  But she’s now outside of the technical bubble, and the audience doesn’t have the knowledge and understanding she’s used to.  They’re not unintelligent; they may well have complex jobs, managing complex situations.  But SIL, LOPA, QRA are not part of their daily lives.  They have no idea what a “mitigated event likelihood” is, or how we set and achieve a target.  Chemical formulae and jargon may well make them feel stupid.  Would that engineer be able to explain that all is under control?

But that is exactly the challenge that Judith Hackitt set us at the Hazards 24 conference this May in Edinburgh.  We are the people who understand the risks, and deal with the balance of keeping risks as low as reasonably practicable.  And with our understanding, and our skills, we should be able to explain to the man or woman on the street how we can and should have facilities managing major accident hazards, maybe not directly in their “back-yard”, but certainly close enough to them to be affected by our work.
Lee LeFever makes explanation videos for a living. In his November 2012 book, “The Art of Explanation”, he presents an A-Z scale of knowledge and understanding.  We are topic experts.  We sit in our bubble, at the WXYZ part of the scale.  But the people we need to reach are back at the start of the scale, and with our jargon, impenetrable language, and science stuff, we leave them behind a large wall.  The cost to them of understanding what we are on about is huge.  Why should they even bother trying?


It should be simple:  an explanation describes facts in a way that makes them understandable.   But to make that explanation work, we need to have the ability and the courage to picture ourselves in another person’s shoes, and communicate to them from that perspective.   However, a good explanation is not always easy.  It needs to start from the context of what the audience can understand, and set the scene.  If the concept is difficult, we need to find analogies to things people do understand, that don’t trivialise the issue, but do lower the cost of understanding.

For example, we deal with the concept of ALARP across industries, and in many legal jurisdictions around the world.  We all can grasp the concept that society does not seek to eliminate all risk.  For us engineers, this means that we need to make the maximum impact that we can, but we don’t have infinite time or money to solve all the problems.  And there’s only so far we can go.  We can aspire towards zero harm, but the law recognises that’s not always possible. 

If we were to extend the ALARP concept to, say, speed limits, we wouldn’t necessarily end up with a strict hierarchy of 20 mph near schools, 30 mph in built up areas, 60 mph on A roads, 70 mph on dual carriageways.  When the school is emptying of children, we could do a lot of harm at less than 20 mph, and 10 minutes later do no harm at all at a higher speed.  60 mph would not be appropriate in freezing fog, or driving rain.  If we happened to be all alone on a four lane motorway in good conditions, we could probably safely drive at 100 mph or more with no consequences for us or for the rest of the world.

Our safety case regime means we take an approach to the hazards, in this case the speed, that’s proportionate to the risk.  We look at the kind of risks we want to control, like we might plan our journey past that school, and onto the motorway.  We think about the likely scenarios we would face, and whether our local situation would influence the likelihood of snow, ice, and rain.  We would examine the features of our car, and how its design, control and safety systems would help us with the journey.  We would publish our case to the regulator about how we would keep the risks on this journey to a sufficiently low level to meet our expectations, and the expectations of the wider society.  We would have to keep the regulator informed about this for as long as we were planning to carry out such journeys.

[As an aside, Montana in the USA did operate an ALARP speed limit regime, in that state law asserted that "A person ... shall drive the vehicle ... at a rate of speed no greater than is reasonable and proper under the conditions existing at the point of operation ... so as not to unduly or unreasonably endanger the life, limb, property, or other rights of a person entitled to the use of the street or highway."  American case law does not however share the UK legal tradition of “reasonableness”, and this approach was ruled unconstitutional in 1998.]

The preceding example illustrates that while ALARP can be impenetrable for some, with a little creativity, we could develop explanations for people that would help them over the high-cost barrier, let them understand why they should care about what we do, and put them in the market for better information on the risks.  But such explanations are not readily available for us to cut and paste.  We need to generate these as industries, as professions.

It’s not hard to imagine the consequences if we don’t.  While we are all closer to some major hazard sites than we may realise, we can all be guilty of relegating them to the “someone else is looking after that so we’re grand” category.  In my 16 mile commute to Aberdeen, I cross three major pipelines, but don’t think too much about the management of that risk.  And on these small islands, if we don’t do our jobs well, we may well end up beyond NIMBY – Not In My Back Yard, to what Stephen Norris called “BANANA”, “Build Absolutely Nothing Anywhere Near Anybody”.  We can’t afford for that to happen.

The challenge is out there.  We need to explain better, and we need to think more about how we explain.  We know we are professionals, that we are part of amazing industries doing fundamentally important things, maintaining life and living standards for many.   We all have “explainer” in our job titles, whether we want to or not.

Further reading:

“The Art of Explanation”, Lee LeFever, Wiley and Sons, 2012
“The Back of the Napkin”, Dan Roam, Marshall Cavendish, 2009

Thursday, 24 October 2013

My Top Six iPad Apps

A friend in our Houston office has just bought an iPad, and asked for some app recommendations. I said I'd email her, but then thought it might be as easy to write a blog entry about it instead, and maybe get some recommendations from the network. So here goes.

1. Notability


The best app I have for taking notes, and what I use for doing my favourite engineering task, checking reports.  The screenshot above is from Ailsa Munro's first draft of her M.Eng thesis.  You can type, highlight, and write onto an imported pdf, and email the result back.  Linked in with Dropbox, I've saved a few saplings so far with this one.

2. Dropbox


This app is simplicity itself, and allows seamless picking up and saving of data between my desktops and the iPad.

3. Paper by 53


A simple but incredibly well executed drawing program.  If you look at this one on the App Store, a lot of people are asking for more control, more choices.  But they're missing the point, and the designers are resisting well.  It turns a simple sketch into something that looks professional.  I've used it for all my illustrations for talks this year, and the feedback has been very flattering.  It's not that my drawing is superb, but this makes it as good as it can be.

4. Myscript Calculator


This turns your finger writing into a full scientific calculator.  Great fun, and handles long sums really well.  And this one is free.

5. Meteogram


This uses the Norwegian weather service forecast to fit an entire week's weather info into two charts.  And it packs in so much information: cloud depth, weather icons for each period, temperature, wind speed and direction, and rainfall, all for the next seven days.  Infographics at its best.

6. Mindjet Mindmanager

This is a great free application on its own.  But add in the fact that the PC version is superb and compatible, and that you can use Dropbox to keep the files synchronised between the desktop and iPad, and it's a great tool.


And honourable mention to....

Facebook, Twitter and LinkedIn.  The apps are now as good as the websites.

And a recommendation for any fan of the Irish Times, a subscription to this.  Great newspaper anywhere.

Monday, 27 May 2013

Explain Yourself....

The phrase "Explain yourself" is one that in the wrong context can send shivers down my spine.  It brings back memories of the Dean of Discipline in my secondary school, probing the extent of a fellow pupil's misdemeanour to help define the punishment. 

In our professional context, though, it's something we probably do every day.  And I'm thinking about it tonight as I just finished the draft of an article, and I'm hoping I got it right.  From simply coaching a member of staff in a new concept, through to detailed justification of the results of our work, explaining is one of those skills that sets apart a consulting engineer from a good design engineer. 

But like any skill, it can be improved.  I'm not sure exactly why I first stumbled on Lee LeFever's book "The Art of Explanation", but since I bought it, it's been a significant contribution to how I approach explaining, and even to how I explain explaining. 

One of the best bits of the book is an A-Z scale of explanation.  As illustrated here, we do often live in a bubble where we expect that everyone else knows what we know.  On an A-Z, we're maybe at W or X, and a lot of our peers are in a similar place.  But not everyone is.

I remember after three years in chemical engineering being a bit amazed that some of the arts students we met didn't know what Reynolds number was, as it was in at least one lecture we had per week, or at times even one a day.  For us, it was like someone not knowing that "A" was a letter, or even what an alphabet was.  But normal people did live outside our bubble, and Re was not a big part of their life.  What I now do is ask people where they are on the scale, and try to tailor my explanation accordingly. 

Do check out the book, and see if it helps.  My copy is on loan to another good explainer I know, but I've got the Kindle version on the iPad for emergencies.

[And by the way, Reynolds number is an expression of the ratio of inertial forces (how readily things flow) to viscous forces (how much other things resist that flow), and is a fundamental concept across chemical engineering.  Actually, maybe I'm a little outside that bubble now, and could use someone else's explanation.  Still remember the formula, though.]
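And since I still remember the formula, here it is as a one-liner, with illustrative pipe-flow values of my own choosing (not anyone's real plant data):

```python
def reynolds_number(density: float, velocity: float, diameter: float,
                    viscosity: float) -> float:
    """Re = rho * v * D / mu for pipe flow (dimensionless)."""
    return density * velocity * diameter / viscosity

# Water at roughly 20 degC in a 50 mm pipe at 1 m/s (illustrative values):
re = reynolds_number(density=998.0, velocity=1.0, diameter=0.05,
                     viscosity=1.0e-3)
print(f"Re ≈ {re:,.0f}")  # ~50,000: comfortably turbulent
```

Which is, of course, exactly the kind of jargon-wrapped detail the arts students were quite happy living without.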