Off the beaten path: rethinking quality in the startup world
Join me to explore how startups deliver value to their customers despite constraints, in contrast to enterprise organisations.
We'll unpack the distinctive challenges faced by startups (especially an Agtech startup), delve into what quality means in this context, and discuss how they navigate the delicate balance between perfection, progress, and innovation.
Transcript
[Aaron Hodder] So, Nevetha is a seasoned product professional and has worked at various start-ups, including Sharesies and more recently, at the Agtech start-up, Hectre. To give us an insight as to what quality means in her context, please join me in welcoming Nevetha Mani.
[Applause]
[Nevetha Mani] Hey, everyone. Aaron first said he doesn't want me to preach. I'm not here to preach. You're experts in this field.
A little bit about me. I started my career as an engineer and then did several things. I was a scrum master, a coach, and now I'm a product manager. I was at Sharesies, and then more recently at Hectre, which is an AgTech start-up based in New Zealand.
You might be wondering, why is a product manager here, talking to a bunch of testers at a testing conference?
I'm not here to talk about your field. But I'm here to talk about quality, and the war stories I've learned moving from large organisations into start-ups, and how that has changed my perception of what quality is.
I want to start the talk by actually focusing on a little bit about the AgTech space and why it's important.
There's a little bit of trivia. If you can, jump in and join via the QR code, or go to the website and join there. Great.
Sorry, the quiz won't be displayed here. It's on your device.
Wow. Almost everyone got that right.
[Laughter]
Why did I want to bring this in? It's a segue into what I was doing at Hectre. I recently left Hectre, a couple of weeks ago, to start my own thing.
As you can see, according to the UN's Food and Agriculture Organisation, 33% of the food that's produced globally goes to waste.
And Hectre's on the mission to ensure the sustainability of food production. That’s their focus.
So, how does an AgTech start-up, sitting here in New Zealand, help reduce this food wastage?
To understand that, let's look at the fresh food supply chain.
We all get our produce from the supermarket; that's what we know.
This is what the fresh food supply chain looks like. You've got a bunch of growers, growing stuff.
And they send all the harvested fruit to Pack Houses.
A Pack House is essentially a warehouse with specialised conditions for storing fresh produce.
Pack Houses get orders from around the world; New Zealand is one of the biggest exporters of fresh fruit. We send apples, cherries, and everything all around the world. The fruit then goes to distributors, who send it to restaurants, grocers, and consumers.
What Hectre does is, they've got two products.
The first product focuses on the grower end of the supply chain, to reduce food wastage.
The second focuses on the Pack House side of the supply chain.
I was working on the product on the growers' side.
It looks at everything that's happening in an orchard: how many trees you're planting, what chemicals you're spraying on the trees, who's picking the fruit, and in what weather conditions you're picking.
You've got all this data, so the farmers can make informed decisions about the quality of their produce, how much waste is being generated, how they can improve their yield, and how they can generate more revenue.
The Pack House product is our premium one.
What that product does is, as fruit is sent on big trucks and they enter these Pack Houses, we've got mounted cameras, which have computer vision and machine learning.
As the truck goes in, it takes a snapshot of the top of the truck, and assesses the produce for quality.
Fruit is graded based on size and colour.
How red is an apple? Say, a Royal Gala's red would be different from a Pink Lady's red. So there are different grades, different sizes.
As the fruit goes into Pack Houses, you need to know this quality, and it has to be put in an appropriate room for optimum storage. Because a Pack House accepts fruit from hundreds of farms around it, all the Royal Galas have to be stored in one room.
How many of you here knew that apples are only harvested three months a year?
Gosh, you all knew that! Before I joined, I thought they come all year round.
They're put in these Pack Houses, and they're there for one year. They put it in that room, and seal it with nitrogen so that it stays fresh.
Once you break the seal, you have to optimally distribute the food, otherwise it goes to waste.
So, these were the two products.
I've always worked in enterprises before I went to Sharesies. And then when I went to Hectre, it's a whole different world.
Most of the products I worked on (I was at Southern Cross) had different challenges. It was more about understanding the user's context.
Coming here, the whole industry was different.
First one is field connectivity.
For most products, people are sitting in the offices. You’ve got internet connectivity, you’re good to go.
But then you take most AgTech products, be it Hectre or Halter. You're out and about in the fields. There is no internet connectivity whatsoever.
So your software has to be designed to work offline. Once you come back online, there's a unique set of challenges.
There are synchronisation issues. Two people may have been doing the same thing offline, so there's data duplication. How do you handle all this?
So every feature, application, and model we design has to cater for both online and offline modes.
Seasonal usage. This is a doozy.
All the software I've worked with... you've all used Canva, Miro, everything, right?
You've got users using it day-in and day-out. You've got real-world usage every day.
For me, if we look at the product, especially the grower end, you've got three months where the product is used. Then for the rest of the time, when it’s not harvest season, people use it...maybe 5% to 10% of the time.
So what this means is, the T word that nobody likes. Every product, every feature that we release, has strict timelines.
I get a lot of flak from my team when I say
“Oh, we’ve got a timeline. We have to release it by December, because it’s harvest season for apples in January.”
Everything has to be modelled around that timeline.
If we miss a harvest season, we lose revenue for the whole year for that country, because we have to wait until the next harvest season.
When I was working at enterprise organisations, even Sharesies, you don't have to worry about these things.
Once you go to start-ups, you're so much closer to revenue.
Being in a product role, you're seeing it. So you have to emphasise all those timelines, which the teams don't like.
It's very important to bring the engineering teams on the journey, so they understand why timelines are important, and why you don't need to craft perfect solutions. You just need something usable that the users can start using.
The third one is geography needs.
Hectre's products are used by customers in Europe, North America, South America, New Zealand, Australia...they’re now going into South Africa as well.
Every country has different regulatory and compliance requirements, and you have to define workflows that cater to them.
You're exporting to different countries. For example, New Zealand is primarily an export market, versus the U.S., which is primarily a domestic market.
The workflows are entirely different. So your product experience and quality should also be centred around the geographical needs of that market.
Labour matters, too. For example, New Zealand and the U.S. are labour-critical markets, because there's a labour shortage. They bring in seasonal workers, so your workflow has to be different.
Compare that with South Africa, a labour-abundant market. They don't mind throwing more labour at a problem, so you have to define a different workflow again.
You’ve got one model that has to cater to all these things.
User base diversity.
We've got users who are very, very tech savvy. A few customers take the data out of our Hectre app, integrate it with Snowflake, use Power BI, and have real-time statistics. They start planning next year's harvest based on the last few years' data.
At the other end of the spectrum, we've got customers who are not so tech literate. One customer said, "I want red and green, so I don't need to know what it is. The red button means don't do it."
You have to design a software that caters to both ends of this spectrum.
With all of that in mind, let's come to quality.
As a product manager, my core job is to create value: value for the customer, but also for the business.
It's not just about delivering what the customer wants, but what generates revenue.
When I look at what a product's quality means, I like this very simple definition:
“Delivering a product or service that meets or exceeds customer expectations.”
Like Shay was saying, quality is not the job of one team. It's not just the QA team's job.
At both Sharesies and Hectre, we never had the luxury of having a QA team with us, because they're not big enough. So everybody shares that quality aspect.
For example, my job as a product manager is to understand the customer context and bring the right information to the engineering teams. The technical and delivery teams, the engineers and testers, make sure that context is translated into solutions. For a marketing team, quality means they're marketing the right modules and products to the right customer bases. So everybody pitches in on quality.
Now, just taking a segue, I want to talk about two use cases, and how they've changed my perception of what is acceptable. What does it mean to deliver a quality product to market?
This first use case is about usability.
The module I worked on last year is called a spray diary.
A spray diary is a record of all the agrochemical spray applications that happen on a farm. Growers need to record everything they're spraying, for two reasons.
One is for compliance and regulatory reasons. You're only allowed to use chemicals permitted by certain regulations.
The second one is for insights. You could be spraying growth chemicals. You want insights into:
How is it affecting my yield?
Is it impacting any diseases that are coming through? You need to have all this data recorded.
What am I spraying?
Which location am I spraying at?
Who's doing the spray?
What were the weather conditions when I was spraying?
So they can use these insights to model next year's data.
How we work is, we work as product trios. We do a Discovery, and then a Delivery. We've got an engineer, a designer, and a PM (myself). We designed a prototype and said, okay, let's go test it out.
You can see it on a Google map; the big block in white is the location, which is usually a bunch of orchards. A customer will have many farms; they're called orchards. Each orchard is made up of multiple blocks, and every block is subdivided into sub-blocks, as you can see on the map.
So, how we designed it was very straightforward.
A user comes in and says, "Today, I'm going to spray on this orchard".
Then they select one or two blocks they're going to spray, and multiple sub-blocks within them. This is how it was.
We took this prototype to users, and we tested it. Boy, oh, boy, we were so wrong. How users think was completely different.
In this context, they’ll say
“I'm going to spray all the Royal Gala today, because they may need the same growth inhibitors. So I’m going to spray everything at the same time.” Or
“I'm going to spray all Royal Gala in orchards X, Y.” Or
“I'm going to spray it here, but not here, because maybe it's right next to a school. Can't spray on a weekday, can only do it on the weekend.”
That's how users think. We didn't have the context.
This was literally…once the prototype was ready, we did a Friday demo to the whole team.
We were like, we have nailed it! This was our first project.
And then we went down to Nelson. We tested it with eight customers. Not a single one of them liked it. They were like, "Oh, that's, like, 10 clicks I have to do. No, I can't do this. This is a busy season. We will not be doing it."
The job the users were trying to accomplish was different. The context was different.
During the harvest season, you're sending workers out to pick fruit.
What we had previously made sense in that context. Orchards, blocks, and sub-blocks made sense when we were going to pick fruit.
But for the job they had today, which was spraying, it didn't make sense because they were trying to accomplish two different things.
Had we gone straight into delivery, without testing with users, the quality of the experience would have been subpar, and people wouldn't have onboarded to that module.
The second use case I want to give is about efficiency.
I think efficiency is the less talked about aspect of quality. In enterprise organisations, we don't focus on it much.
What it means is: does the product let users accomplish their tasks with ease?
I want to talk about an onboarding example.
Hectre is a B2B product, which means it's sales-led onboarding.
The sales team closes a deal, and then we have to onboard the customer in the app.
Onboarding usually involves a few things. They have to configure farm structures. Sometimes it's a hundred hectares; some of the customers have, like, a thousand hectares.
You have to configure their farm structure, register all the staff members, and customise their timesheets and payroll. Everybody does timesheets differently: overtime, double time; different geographies have different needs and regulations. Everything has to be customised.
When I joined the company, this was one of the things my customer success manager said:
“Oh my team spends a lot of manual time doing this. Can you fix this?”
In the current state, what happens is: the sales team signs the deal, and then the deal goes to a customer success team or an operations team. They manually set every customer up, so there's a lot of back and forth with the customer. They're asking, how does your timesheet work? Is there a payroll integration we have to do?
It takes between 20 and 40 hours, depending on the size of the farm, before users can start using the app.
How many of you think that's efficient?
Nobody. That's what I thought, as well.
So we designed an ideal state. We said “Okay, what is it going to look like?”
We pulled in a tech lead and said “What does an ideal state look like here?”
Once a deal is signed, the sales team can put it through, an account has to be provisioned automatically, and users can onboard themselves. There’s no manual hand-holding. That’s the ideal Nirvana state.
After prioritisation, though, the initial state was what we ended up with, and my customer success team was not happy.
If I had put my engineering hat on, I would have gone with the ideal state. Being an engineer, you want to get your hands on all the latest technology. You want to use analytics, insights, and everything.
I want to talk about progress over perfection.
We had just designed this ideal state; we hadn't delivered any code. We hadn't even touched it. We designed what the workflow looks like and did a little bit of service design.
We spoke to the sales team, and talked to recently onboarded customers. They were like, nah. They really liked the current state. They really liked the personalised onboarding experience.
Most of our customers are not very tech savvy, so they really like the handholding. Even if it was taking a lot of time going back and forth.
The industry norm in the AgTech space is that you're not signing 10 customers every day. You might close 10 deals in a quarter, based on the season.
So our customer success team was not too overwhelmed by it, and we said it makes sense for today. When we're onboarding 10 to 15 customers every week, that's when we need to revisit it. So this was deprioritised.
Some of these things… I know efficiency is important. We have to do it eventually, but it's okay today to go with what already exists.
And over the last three years, working in start-ups, my mindset has slightly changed about what quality means.
When I used to work at enterprise organisations, stability and scalability were very, very important, because you have to make sure your products and features reach a much larger customer base and are stable. Every deployment is stable.
In start-ups, you're more agile. You're more about speed to market. You need to bring in more money, so you have to go to market much, much faster. You don't have the luxury of sitting and testing everything, making every single thing perfect.
The second one I see, in my perception, is that most enterprises are risk conscious or risk aware, because they have an established customer base. They want to minimise disruptions.
Versus in start-ups, you're more experimental. We'll just release changes, and use Mixpanel or whatever analytics to see how users are using them. If we see that users are struggling, we remove the feature. Everything is done live.
But by this, I'm not saying it's okay to release an unusable, defective product. It's okay to go in with a solution that is usable today, accumulate the tech debt for now, and fix it later.
And the whole time I've been in start-ups, my mindset has slightly changed. Slightly evolved.
One of the key takeaways I would like to emphasise here is: Understand, adapt, and deliver to context.
You could design the perfect solution, right? You can have everything, but if it doesn't match the customer context, it's not usable, and that degrades quality.
It is very important to sit down and talk to customers. If you have the luxury of engaging with customers directly, do it.
In most of my teams, I work as trios or quads.
Usually it's myself, my design lead, my tech lead, or anyone on the team: data, it could be anyone. We sit down with customers to do Discovery. It's not me doing Discovery and passing on second-hand information; I bring everyone into Discovery as well.
If you don't have the luxury of talking to customers directly, you have an operations team, a customer success team, or other customer-facing teams. They have just as much information.
During my time at Southern Cross Health Insurance, I sat with our call centre people. They know how customers use the product; they're the subject matter experts. You get so much knowledge from them. So I recommend spending as much time as you can with customer-facing teams, to understand the jobs customers are trying to accomplish. As QA engineers, you play a pivotal role in ensuring that everything that leaves the door is fit for purpose and usable. Having that context will help you make sure it's fit for purpose.
The second takeaway is, you've got context, which is usually set by the product managers. “Why are we doing this?”
The solutions come at the end, and the designer does something in the middle. If those are all detached, we'll still be writing use cases or test cases without understanding how that context turned into solutions.
I would recommend spending a lot of time with your product and design people during the Discovery phase, not just after Delivery. It shouldn't be a handoff from Discovery to Delivery. Can you see that the information flow makes sense? If not, seek that clarity.
This is my favourite takeaway: embrace agility over perfection.
This approach lets us test with real usage, gather real feedback, and then improve on that feedback.
If you sit around getting everything perfect… This was me 10 years ago. If you'd asked me when I was a developer, I would have said, "85% code coverage. All of this has to be automated."
So my perception has changed now.
You need to bring money in the door at a start-up, which means you're testing with real users.
And users, from what I've seen at least, are not very pedantic about whether a button has to be on the left or the right. They're not testing every edge case. So it's okay to release something that's not 100% perfect, because you'll have the luxury of time later, when you've signed more customers up.
That's pretty much my talk.
I hope I've given you enough context on what product managers see as quality, and that it helps you go further. Thank you.
[Applause]