The SEO Sprint

Product and Engineering Wisdom - Issue #20


Tips, advice and guides on experimentation and discovery

Adam Gent
Oct 21, 2021

A free fortnightly email that highlights the relevant tips, advice, and case studies from the world of product and engineering for the SEO community.


Hello 👋,

In this issue, the topic is experimentation and discovery.

Successful product teams use experiments and discovery sprints to get buy-in, reduce risk and continuously learn.

Stay safe and enjoy.

Adam


📰 Newsletter News

I wanted to let you know that I’m currently working on writing my own newsletters based on my experience in product and as an SEO (not just round ups!).

It has been a busy year but I'm prioritising time for writing.

I’ve calculated that even if I wrote one a week, that would be 8 months of newsletters 😯. I'll have to do some kind of course at this rate...


⚡Wisdom of the Sprint

Stay Humble or Be Humbled — Case Studies from the Experimentation

by Anthony Rindone

Time: 30 mins ⏲️

Anthony Rindone, Product Manager for Split, talks about how product teams can make better decisions using experiments.

Adam’s Insight: Did you think SEOs were the only ones who do experiments?

Successful Product and UX teams actually spend time in discovery sprints trying to learn more about the customer's needs through A/B split tests, experiments and minimum viable products (MVPs).

Testing your ideas as early as possible is very humbling, because customers will quickly prove when an idea isn't great (or even useful). The most important thing about experimenting is continuous learning: building and learning quickly to prove hypotheses, prioritise feature requests and mitigate risk for big (resource-heavy) ideas.

What I like about Anthony's webinar is that he admits what he learned from each experiment and how he'd do it differently in the future. This is exactly how product teams (in my experience) use experiments.

If you'd like to read more about general product experimentation, there is a good guide here, and a famous SEO experiment guide from Pinterest here.


✨Product Wisdom

Discovery – Feedback

by Marty Cagan

Time: 10 mins ☕

Marty Cagan, Silicon Valley Product Group, highlights the common ways product teams try to get out of gathering critical user feedback.

Adam’s Insight: In product teams, it's critical that you get feedback on an idea as soon as possible.

However, teams can always come up with excuses to not get customer feedback.

These excuses can include:

  1. The truth is painful for those who developed the idea

  2. There is no time to gather feedback

  3. Confirmation bias means people seek confirmation (not feedback)

  4. Little sense of ownership of the product or feature.

Marty provides ways to overcome these excuses and problems.

If you're an in-house SEO or SEO Consultant who gathers qualitative and quantitative customer feedback these tips can be applied.

Representing Customer Segments on Your Opportunity Solution Tree

by Teresa Torres

Time: 10 mins ☕

In this blog post, Teresa Torres discusses how to include customer segments in an opportunity solution tree.

Adam’s Insight: Opportunity solution trees are a simple method of visualising your thought process.

Source: https://www.producttalk.org/opportunity-solution-tree/

Product teams use visual models to map out user journeys, tell stories and get buy-in. If you're struggling with a problem, this is yet another method that can help you discover potential customers.

Source: https://www.producttalk.org/2021/10/customer-segments-teresas-take/

If you're an SEO interested in learning a visual method to map out and discover different customers and their pain points, I recommend giving this a read.

I'd recommend using a tool like Miro to map out your opportunity trees.
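If it helps to see the shape of the structure before opening Miro, here's a tiny hypothetical sketch in Python. The outcome, segments, opportunities and solutions are all invented for illustration (they are not taken from Teresa's post); the point is simply that segments sit as the first branch under the desired outcome:

```python
# Hypothetical opportunity solution tree, with customer segments as the first
# branch under the desired outcome (all names invented for illustration).
opportunity_solution_tree = {
    "outcome": "Increase organic sign-ups by 10%",
    "segments": {
        "First-time visitors": {
            "Can't tell what the product does from search snippets": [
                "Rewrite title tags and meta descriptions",
                "Add FAQ schema to key landing pages",
            ],
        },
        "Returning researchers": {
            "Comparison pages don't answer pricing questions": [
                "Publish a pricing comparison page",
                "Test an interactive pricing calculator",
            ],
        },
    },
}

# Walk the tree: one line per segment -> opportunity -> candidate solution.
for segment, opportunities in opportunity_solution_tree["segments"].items():
    for opportunity, solutions in opportunities.items():
        for solution in solutions:
            print(f"{segment} | {opportunity} | {solution}")
```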


⚙️Engineering Wisdom

Interpreting A/B test results: false positives and statistical significance

by Martin Tingley et al

Time: 20 mins ⏲️

This is part 3 of Netflix's series on how they use A/B split testing. In this part, the Netflix engineering team talks about interpreting A/B split test results and statistical significance. You can read parts 1 and 2 below:

  • Part 1 - Decision Making at Netflix

  • Part 2 - What is an A/B Test?

Adam’s Insight: Interpreting A/B split test results is tricky even for product and engineering teams.

What I enjoyed about this blog post is that it's an accessible guide to interpreting complex A/B results. For example, they explain false negatives and false positives using cats and dogs!

Source: https://netflixtechblog.com/interpreting-a-b-test-results-false-positives-and-statistical-significance-c1522d0db27a

If you're still getting your head around A/B split testing, this series is a good introduction to how an enterprise company like Netflix runs experiments.
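The post goes into false positive rates and p-values in detail. Purely as a rough illustration (this is not Netflix's code, and the conversion numbers below are made up), here's a minimal Python sketch of a two-proportion z-test, the kind of calculation that sits behind a statement like "the variant's uplift is statistically significant":

```python
import math

def two_proportion_z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled rate under the null hypothesis that both variants convert equally
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# Made-up example: 10,000 sessions per variant, 500 vs 560 conversions
p_a, p_b, z, p_value = two_proportion_z_test(500, 10_000, 560, 10_000)
print(f"control {p_a:.1%} vs variant {p_b:.1%}, z = {z:.2f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("Significant at the 5% level (though some true nulls will still pass)")
else:
    print("Not significant: the difference could easily be noise, i.e. a false positive risk")
```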

