This document discusses using metrics and indicators in Agile development. It advocates starting with questions rather than jumping straight to metrics. Good indicators help answer questions about whether a team is going in the right direction, how soon they will get there, and how fast they can adjust. The document provides examples of learner-focused questions versus judger-focused questions and emphasizes the importance of curiosity.
@selleithy
#AgileIndicators
LinkedIn.com/in/selleithy
Agenda
● The problem with metrics
○ Why do some love metrics and others hate them?
● Why do we need metrics?
● A different perspective
● What are Agile indicators?
○ Are we going in the right direction?
○ How fast are we going?
■ (Underlying question: When are we going to get there?)
● Start with Questions!
● Staying Curious!
I love metrics because...
I hate metrics because...
Love / Hate relationship with metrics...
I constantly feel evaluated or judged
they convey that we are not good enough
(managers usually look for the negative!)
they can help us learn and improve...
they could highlight bottlenecks
they offer some level of confidence
they don’t tell the whole story
they almost always miss the context!
managers use them to measure productivity
I can use them to forecast what might happen
I can use them to tell whether our project will
be on budget, time, scope, etc.
they usually confirm our biases!
they can easily be gamed!
Metrics translator...
When managers or leaders ask... → Team hears...
Why are you not going faster? → What have you been busy doing?
Can you get this issue resolved? → Can you give me an answer now?
What is your velocity? (story points delivered per sprint) → How busy are you?
When are you going to deliver this feature? (date) → Are you sure you are going to be on time?
What is your estimate? → Can you deliver by this date?
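Velocity itself is simple arithmetic; the trouble is how it is used. As an illustration (with hypothetical numbers, not any real team's data), velocity can be treated as a trailing average that answers "how fast are we going?" rather than "how busy are you?":

```python
# Illustrative sketch only: the story-point totals below are hypothetical.
completed_points = [21, 18, 25, 20, 23]  # points delivered per completed sprint

def rolling_velocity(points, window=3):
    """Average story points delivered over the last `window` completed sprints."""
    recent = points[-window:]
    return sum(recent) / len(recent)

print(rolling_velocity(completed_points))  # trailing 3-sprint average
```

A trailing average smooths out single-sprint noise, which is exactly the failure mode in the "you failed our sprint" story later in this deck: one data point is not a trend.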
In reality, it looks like this...
Question (Start here!)
What questions do we need to answer?
Indicator
What indicators help us answer these questions?
Structure / Data
What structures support these indicators?
What data do we currently have?
What data do we need to have?
Agile Indicators
Start with Questions!
What are Agile Indicators?
Based on our observation and experience, Agile
indicators boil down to 3 main questions:
● Are we going in the right direction?
(Compass)
● How soon will we get there? (Speed)
● How fast can we adjust? (Response)
Imagine if...
We aligned on the questions and the indicators with the teams from the get-go!
“For instance, at Amazon, metrics are established in advance of
every activity and specify what actions are expected to happen in
ways that can be measured in real-time. If the metrics show that
activity is not having the impact that was expected, action is
required.”
-Stephen Denning. Forbes: Why Agile
Often Fails: No Agreed Metrics
What questions come to your mind?
Team A
Team B
How is team A getting
more work done than
team B?
What’s the team
velocity?
How many stories are
we completing every
sprint?
Where are we seeing
bottlenecks?
What can team A learn from
team B and vice versa?
Where do we see opportunities
for improvement?
What questions come to your mind?
[Chart: stories delivered across Sprints 1–10]
When can we deliver all
stories?
What is our “release” goal?
How likely are we to meet our goal
(based on this chart/trend)?
What can we deliver by
Sprint 11?
What caused the scope
change?
What are we noticing? What
insights can we take from
this?
Can we deliver all stories by
Sprint 12?
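One way to answer "How likely are we to meet our goal?" from a trend like this is to resample past throughput. A minimal sketch, assuming a hypothetical throughput history and remaining-story count (none of these numbers come from the deck):

```python
import random

# Hypothetical inputs: stories finished per past sprint, stories still
# remaining in the release, and sprints left before the target date.
history = [4, 6, 3, 5, 5, 4]
remaining = 14
sprints_left = 3

def chance_of_finishing(history, remaining, sprints_left, trials=10_000):
    """Estimate P(all remaining stories done by the target sprint)
    by resampling past per-sprint throughput (a simple Monte Carlo)."""
    hits = 0
    for _ in range(trials):
        done = sum(random.choice(history) for _ in range(sprints_left))
        if done >= remaining:
            hits += 1
    return hits / trials

print(f"chance of meeting the goal: {chance_of_finishing(history, remaining, sprints_left):.0%}")
```

A probability invites a learner conversation ("what would make this more likely?") in a way that a single promised date does not.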
Most teams hate the idea of tracking or sharing metrics with managers or leaders. When teams are asked to share metrics like velocity, burn-down, and other output-driven metrics, what they hear is "What are you busy doing?" So the team starts focusing on showing how busy they are, which drives the wrong behavior and stifles their opportunity to learn, grow, and use metrics for good.
Since the essence of agility is continuous improvement via inspect-and-adapt, we want to reframe the conversation around Agile indicators that start with questions! What questions are you asking your team? How are you helping them learn and grow? What indicators are you looking at?
In this session, we will share insights on using Agile indicators together with questions in order to embrace a different mindset: one that encourages more learning and growing and less judgment. A curiosity mindset that encourages organizations to move from "busy work" and output-focused metrics toward outcome-focused questions.
Use a piece of paper and split it in half. On one half, write down what managers (or leaders) want to measure; on the other half, write down what managers (or leaders) can actually measure.
Turn to the person next to you
The problem with metrics
Why do some love metrics and others hate them?
Why do we need metrics?
A different perspective
What are Agile indicators?
Are we going in the right direction?
How fast are we going?
(Underlying question: When are we going to get there?)
Staying Curious!
Understand the difference between the learner mindset and the judger mindset
Adopt a curious, learner mindset using choice maps
Possibly introduce Q-storming / Q-prep
Opinion: Use the questions from the earlier slide
Possibly introduce Intent-based leadership by David Marquet
Judger Questions
Why is team A getting more work done than team B?
What’s the team velocity?
How many stories are we completing every sprint?
Learner Questions
What can team A learn from team B and vice versa?
Where are we seeing bottlenecks?
Where do we see opportunities for improvement?
These stories are for a specific feature, and these teams are working on more than one feature in any given sprint.
When can we deliver all stories? (Fixed scope)
What can we deliver by Sprint 12? (Fixed Date)
Can we deliver all stories by Sprint 12? (Fixed Scope and Date)
Learner Questions
What is our “release” goal?
How likely are we to meet our goal (based on this chart/trend)?
What are we noticing? What insights can we take from this?
What is our biggest takeaway here?
What caused the scope change?
Think of our stories as experiments, because we don't know what we don't know yet.
Story about a lead saying the team is horrible and "failed our sprint," basing it on a single data point.
Judger questions:
Why are we not meeting our commitment? (Most teams will then commit to less work so they can meet their commitment and look good.)
Learner questions
What obstacles are we running into?
How long does it usually take to resolve these obstacles?
What are the common obstacles we run into?
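Learner questions like these can often be answered with very simple aggregation rather than a dashboard. A sketch using a hypothetical blocker log (the obstacle types and day counts are invented for illustration):

```python
from collections import Counter
from statistics import median

# Hypothetical blocker log: (obstacle type, days it took to resolve).
obstacles = [
    ("waiting on another team", 3),
    ("flaky test environment", 1),
    ("unclear requirements", 5),
    ("waiting on another team", 4),
    ("flaky test environment", 2),
    ("waiting on another team", 2),
]

# "How long does it usually take to resolve these obstacles?"
days = [d for _, d in obstacles]
print("median days to resolve:", median(days))  # median is robust to one slow outlier

# "What are the common obstacles we run into?"
for kind, count in Counter(kind for kind, _ in obstacles).most_common():
    print(f"{kind}: {count}")
```

The point is not the tooling: a team that logs its obstacles, even roughly, can answer its own learner questions instead of waiting for a manager's judger questions.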