
Stuff I've Learned About Product Management

Nearly four years into "enterprise product management" roles, I've learned a few practical lessons that the product, UX, and leadership books, websites, and podcasts seem to gloss over or take for granted.

Specifically: remembering your audience puts real pressure on your communication skills. Roadmaps expire, and you'll recreate them over and over again. And you have to be willing to break the "rules."


Remember your Audience

All the product management and ownership advice, books, and podcasts I've absorbed in these past years recognize that product people work with a fairly large cross-section of both their organization and customers.

And all the lessons in communication and writing suggest starting with your audience in mind.

Combine these and you have several slices of the big picture, different vocabulary and terms across various groups, and different levels of detail across mediums.

For example, I regularly write or explain a feature from different perspectives and even at different levels of detail for the same kind of colleague. I need to be able to:
  • Explain an end-user or technical feature in terms of business impact, feasibility (time), and impact on organizations
  • Break down a user story in terms of the jobs it supports to give context to a UX designer
  • Break down a separate user story in terms of the implementation details, addressing scope, acceptance criteria, and nonfunctional requirements
  • Break down the same thing in far fewer words for those already familiar with the change
  • Describe existing features to those unfamiliar with them
  • Describe expected features in a future release
Some organization, a bit of empathy, and a sense of humor can help you adjust the message to your audience.

Groundhog Day Lists

Roadmap. Vision. Themes.

I've learned various ways to organize, prioritize, or otherwise determine what to build next within and across teams.

You might work out dependencies across teams using sticky notes and string, or the digital equivalent. You might use MoSCoW: must have, should have, could have, and won't have. Maybe WSJF (Weighted Shortest Job First) or some other form of weighted scoring is your thing. Votes are popular, and you might even have customers vote on their favorite ideas with fake money or digital votes.
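
To make the weighted-scoring idea concrete, here's a minimal sketch of a WSJF calculation (cost of delay divided by job size, per the usual SAFe formulation). The feature names and scores are made up for illustration:

```typescript
// WSJF sketch: cost of delay / job size; higher scores get built first.
// Names and numbers below are hypothetical.
interface Candidate {
  name: string;
  businessValue: number;   // relative 1-10
  timeCriticality: number; // relative 1-10
  riskReduction: number;   // relative 1-10 (risk reduction / opportunity enablement)
  jobSize: number;         // relative effort, e.g. story points
}

const candidates: Candidate[] = [
  { name: "SSO login",  businessValue: 8, timeCriticality: 9, riskReduction: 5, jobSize: 8 },
  { name: "CSV export", businessValue: 5, timeCriticality: 3, riskReduction: 1, jobSize: 2 },
  { name: "Audit log",  businessValue: 6, timeCriticality: 4, riskReduction: 8, jobSize: 5 },
];

const wsjf = (c: Candidate): number =>
  (c.businessValue + c.timeCriticality + c.riskReduction) / c.jobSize;

for (const c of [...candidates].sort((a, b) => wsjf(b) - wsjf(a))) {
  console.log(`${c.name}: ${wsjf(c).toFixed(2)}`);
}
```

Whatever the weights, the win is that the arithmetic is visible: anyone can challenge a score instead of arguing with a gut feeling.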

Pareto's principle comes up a lot too: spend your planning time on the 20% of user actions that can improve things 80%. But since big or far-off features get harder and harder to predict, you might use Fibonacci numbers for sizes (or even for priority), along with time-boxing and smaller and smaller planning windows.
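
As a small illustration of Fibonacci sizing, here's a sketch (the raw estimates are hypothetical) that rounds estimates up to the next Fibonacci bucket, so bigger, riskier items land in deliberately coarser buckets:

```typescript
// Round a raw estimate up to the nearest Fibonacci bucket.
// Coarser buckets for bigger items reflect growing uncertainty.
const buckets = [1, 2, 3, 5, 8, 13, 21];

const toFibSize = (raw: number): number =>
  buckets.find((b) => b >= raw) ?? buckets[buckets.length - 1];

console.log(toFibSize(4));  // 5
console.log(toFibSize(9));  // 13
console.log(toFibSize(40)); // 21 -- anything this big probably needs splitting
```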

Personally, I don't think the specific technique matters as much as a few basic principles:
  • Make the process and calculations transparent
  • Involve those who would be impacted by the plan, from other departments to customers
  • Ultimately own the decision
One interesting side effect of such lists is that they are snapshots: created and seen at particular moments in time. It's fairly easy to stay in sync with your immediate colleagues, perhaps using a sprint planning board. However, broader stakeholders and customers hear about your plans at different times and from different people.

Just keep in mind that once you have a good roadmap, you'll have to create or revise the plan and then share it with the audiences mentioned above. And after that, you'll revise and share the plan with the audiences mentioned above. And after that...

Breaking the Rules

UX, product, and development practices espouse things like agile methods, minimum viable products, customer research, continuous integration/delivery/learning, and customer-centricity.

But there is no perfect one-size-fits-all process to apply or buy.

Don't Focus on the How, Most of the Time...

As a product owner, I'm supposed to focus on the "why" and "what" rather than the "how." But when changing technologies, addressing our nonfunctional requirements, and working with people who have varying levels of experience with the product, sometimes I do have to care about how things are implemented.

As an example, our software stores user-specific profile data, such as language and region, in our system's database as "application data." Long-time developers might assume all user-related data should be stored the same way.

But things are changing. For example, identity providers and other systems could store user-specific information, and anonymous session-related information could be stored in the user's browser. Add the fact that our software must be deployed to highly available environments, and we also don't want the application to store anything on disk.

So though I shouldn't focus on how the code stores data, I need to be aware of which kinds of data belong in which places, and what cannot be stored in certain places.
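
As a hedged sketch of that awareness, here's roughly how I think about the mapping. The data kinds and store names are invented for illustration; the point is knowing where each kind of data belongs, not the implementation:

```typescript
// Hypothetical mapping of user-related data kinds to where they may live.
type DataKind = "profile" | "identityClaim" | "anonymousSession";

function storeFor(kind: DataKind): string {
  switch (kind) {
    case "profile":
      // Durable, user-specific settings (language, region) stay in the
      // application database as "application data".
      return "application database";
    case "identityClaim":
      // Attributes owned by the identity provider are read from it,
      // not duplicated in our database.
      return "identity provider";
    case "anonymousSession":
      // Anonymous, session-scoped state lives in the user's browser
      // (e.g. sessionStorage) -- never on the application server's disk,
      // which would break highly available deployments.
      return "browser storage";
  }
}

console.log(storeFor("profile")); // "application database"
```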

Research. Design. Implement. Validate. Or Not.

There's an assumption in software development and UX life cycle recommendations that there's a proper order to creating services or solutions.

I often see it presented in an order such as:
  1. Research
  2. Design
  3. Implement
  4. Validate

The Midas Rule has shown that it's often not this structured, or at least not applicable to every kind of service or deliverable.

As an example, like any organization, we work with a lot of third-party libraries, components, and tools. And this means we need to account for what's easy or hard with a given control.

When dealing with such third-party elements, it helps when designers use them in a way you can implement or configure with minimal customization. And though this can increase productivity, you will encounter issues specific to the library, component, or tool. Also, reliance on certain tools or technologies impacts the technical skills needed for your teams and occasionally morale.
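
To make that concrete, here's a hedged sketch; the component and its options are hypothetical stand-ins for any vendor control. The idea is to prefer the component's built-in configuration over custom hooks into its internals:

```typescript
// Hypothetical third-party grid options; a stand-in for any vendor control.
interface GridOptions {
  sortable?: boolean;
  pageSize?: number;
  // Custom renderers are where library-specific issues tend to surface.
  cellRenderer?: (value: unknown) => string;
}

// Lower risk: lean on the built-in configuration.
const lowRisk: GridOptions = { sortable: true, pageSize: 25 };

// Higher risk: customize, and now we own escaping, accessibility, and
// breakage on every library upgrade.
const higherRisk: GridOptions = {
  sortable: true,
  cellRenderer: (v) => `<b>${String(v)}</b>`,
};

console.log(lowRisk, higherRisk);
```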

Sometimes I'll reverse the process, especially when trying to improve an existing solution. I'll bring the validation, challenges, and opportunities from my background in the field as a customer and consultant. Then we'll implement a solution using a well-known or trusted component. And once we "prove" the concept in a, well, proof of concept (POC), we'll go back and revisit UX-related activities like usability testing and validation.

The assumption that iterating on designs saves development time doesn't always hold.

Stuff Learned

I'm still learning, applying, and growing. As we get ready for our next big release, I'm reflecting on how important it is to keep my audiences in mind, how ephemeral lists are, and how guidelines and best practices are just a place to start. You have to do the work, learn the lessons, and keep adjusting to make a difference. Get started, care enough, and Midas Rule your way to working out the right problems to solve with the right people.
