This is a blog I wrote in late 2015 and just discovered sitting in my drafts. As I read it I can still remember the project, the questions and the problematic discovery. I can also remember that this was classified as a simple change and “waterfalled” to me. I thought the blog worth sharing, along with a heuristic of mine: if people start talking about how “easy” a project is, be alert, there will be dragons. That false sense of security and acceptance often signals a lack of vigilance and critical thinking. Be aware, be alert, be an active questioner.
“An implicit part of your preparation and your mission is to recognize, analyze, exploit, and manage your emotional states and reactions.” – Michael Bolton, Developsense
I’m sitting in a meeting room; key people from the Development team are with me, along with a Technical Engineer. The reason for the gathering: I found a bug while testing, and the bug has pointed out the need for a fundamental change to deliver the desired functionality to our client. This problem has been exposed because we “messed with the Laws of Nature”. Maybe that’s overly dramatic, but we made a change and failed to fully appreciate the nuances. The way this issue surfaced, and some subsequent discussions, have had me reflecting. This blog is about some of those reflections and resultant observations.
“When you find yourself mildly concerned about something, someone else could be very concerned about it.”– Michael Bolton, Developsense
It’s more than a feeling (more than a feeling)
When I hear that old song they used to play (more than a feeling)
And I begin dreaming (more than a feeling)
Boston – More than a feeling
I’m talking about emotion because, right from my initial involvement in this project, my gut feeling was that something was not quite right. It felt like the solution had oversimplified the problem and under-considered the changes. I don’t doubt that tight timelines impacted what was done, compared to what could have been done, on the design analysis side. We were changing an area of complexity where no one fully understands the “inner workings” in a holistic way (it’s a big legacy system). The client’s desire focused on improving performance, but, in doing so, we had created a hybrid being. It had features of two types of existing entities but was neither. Those feelings had me asking lots of questions and challenging ideas, but I failed to ask the right question.
“If they can get you asking the wrong questions, they don’t have to worry about answers.”
― Thomas Pynchon, Gravity’s Rainbow
The change involved cash accounts; think bank account if you wish. To meet client desires we needed to create an account that effectively did not accrue interest but had some interest accrual features. We underestimated the complexity of this task, and as a result our models and oracles were poor. An assumption had been made about behaviour; the assumption was wrong, and I was unable to frame questions to expose it. I’m realistic enough to realise that a bunch of other people failed to expose the assumption as well. Up until yesterday the testing results actually aligned with our expectations; our models seemed OK.
“The scientist is not a person who gives the right answers, he’s one who asks the right questions.”
― Claude Lévi-Strauss
A number of times my testing raised what I felt were consistency issues. These were explained away. This is not to say I was dismissed, or the questions not listened to. The explanations were reasonable but generally left me with the idea that, at minimum, we might be missing some opportunities to deliver a better outcome in terms of consistency. Part of the confusion resided in a non-accrual account type now supporting interest entries. This meant that when I tried to talk about process within the interest function it was never the “whole function” as we had known it. It becomes easier to justify behaviour when you think purely in terms of the previous functions and don’t really think about how those have now been twisted. Our abstractions were leaky, but we didn’t see it. Perhaps, because the results and our oracle seemed to align, I became a little complacent. Maybe, just maybe, I questioned the answers to my questions a little less than I should have.
“Most misunderstandings in the world could be avoided if people would simply take the time to ask, “What else could this mean?”
― Shannon L. Alder
I was testing the penultimate functional area. We hadn’t made any direct changes in this function, but the changes we had made would flow through here, so I needed to test and see that it looked OK. This function is an interesting one, not one I’ve spent a lot of time in recently, and it had the potential to be challenging. The first thing I decided to do was check the module parameters. There were four parameters related to cash transactions. Two of these related directly to elements of the enhancement changes. This was cool because it gave me a chance to see my test data, and outputs, in a new way. I would be able to componentise aspects of the data. I could quite possibly find issues in here that would not be easily exposed via other functions I had tested.
I ran a test against data I had used in other functions. I had modeled the data and had specific outcomes I expected to see. I got, literally, zeros. I stopped, did a double take, checked some parameters and valuation dates; did I use the right accounts? Everything checked out, so I executed the function again. Same result. This was really out of the blue; everything so far had checked out (it wasn’t bug free, but it wasn’t “on fire” either). I ran configurations of the two primary parameters that interested me against my target valuation date, plus and minus one day. There were anomalies I just could not explain; the results on my primary valuation point were just bizarre. I sent my spreadsheet and some other details to the Developers and Business Analyst – “Hey guys, I can’t explain this. Can you have a look so we can work out what’s going on.”
They did, and I’m told, within 20 minutes of starting the review of my data, outcomes and the code they realised our solution was fundamentally, and fatally, flawed. It could not deliver what the client desired. I was somewhat happy this had come to light before release. While I couldn’t specifically target the right question when I wanted, my gut feeling had been right. The emotional discordance had a basis. My testing approach had also enabled me to eventually find where the software was broken. I had learnt along the way and applied that learning as I went.
Since realising that we had a considerable problem we’ve had a few discussions, mostly around recovering the situation. There are lessons for our Development floor, things that we could have, and should have, done. The potential to find this before we wrote a single line of code was missed. The opportunity to discuss with our clients how they would use this new functionality, and the report values they would desire, was not taken up. If it had been, we would have had examples that would have allowed us to determine, before writing a line of code, that this project was not simple and possibly not even desirable.
“The Wilderness holds answers to more questions than we have yet learned to ask.”
― Nancy Wynne Newhall
For me the lessons are simpler, they revolve around questions:
- On what basis are you making your assumptions?
- How do you know your assertions about outcomes are correct?
I asked these questions, but not in an effective way. I should have questioned my questions, rephrased them into better questions, and asked those.
Then there are answers. When the answer doesn’t completely resolve your disquiet, when your gut feel is that there might be something missing, something important, keep pursuing that hunch.
3 thoughts on “The question not asked”
You have possibly overlooked one scenario – that what the client wanted wasn’t even possible, that the client’s own idea was something that their peers might have said was stupid, that the client’s CEO woke up one morning with a wild idea that no-one else in the company dared to challenge or could talk them out of despite knowing it to be the product of a diseased mind.
Quite how you get around that scenario, I don’t know. Possibly if you have more than one channel of communication into the client’s organisation, you might be able to find allies who can persuade the client-side Product Owner that “this is a stupid idea, sir” whilst you engage in technical sounding activities that are merely allowing the wheels to spin whilst at best only demonstrating the ultimate futility in devoting any further time to the project (but still ensuring you get paid for it).
Best of luck with that.
It’s an interesting situation you raise. The company I was at, at the time, would not entertain that type of conversation. I can remember projects where those conversations were held internally but would never be pushed out to clients. I can remember raising a question along similar lines for the project I wrote about. Not because of the request itself (from memory it was a requirement based on trading rules in the country of this client) but because of the way we proposed to tackle it. Cost had a lot to do with the chosen approach. However, at my last company I worked in two teams, and in both we actively pushed back on some stories because they appeared to not make sense (either in the “what do they want?” or “why are we doing this?” sense). That had one of two outcomes: the story returned with better definition around value, or it was canned. That ability to push back, with reason, all the way back to your customers is, IMO, a key component of producing high quality software and building valuable client relationships.