This presentation describes the process of producing rich product requirements through collaborative game play. It also discusses the phases of product innovation in both waterfall and agile environments.
Reference and master data management:
Three categories of structured data:
Master data: data associated with core business entities such as customer, product, and asset.
Transaction data: the recording of business transactions, such as orders in manufacturing, loan and credit card payments in banking, and product sales in retail.
Reference data: any kind of data that is used solely to categorize other data found in a database, or solely for relating data in a database to information beyond the boundaries of the enterprise.
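As an illustrative aside, the three categories above can be sketched in a few lines of Python (all table and field names here are hypothetical, not taken from any particular system):

```python
# Reference data: used solely to categorize other data.
order_statuses = {"N": "New", "S": "Shipped", "C": "Cancelled"}

# Master data: core business entities.
customers = {1001: {"name": "Acme Corp", "country": "US"}}
products = {"SKU-42": {"description": "Widget", "unit_price": 9.99}}

# Transaction data: records of business events, linking master entities
# and categorized by reference data.
orders = [
    {"order_id": 1, "customer": 1001, "product": "SKU-42",
     "quantity": 3, "status": "N"},
]

# A transaction only makes sense by reference to master and reference data:
order = orders[0]
line = (customers[order["customer"]]["name"],
        products[order["product"]]["description"],
        order_statuses[order["status"]])
print(line)  # ('Acme Corp', 'Widget', 'New')
```

The point of the sketch is the dependency direction: transactions point at master records, and both lean on reference data for categorization.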
DAS Slides: Building a Data Strategy — Practical Steps for Aligning with Busi... (DATAVERSITY)
Developing a Data Strategy for your organization can seem like a daunting task. The opportunity in getting it right can be significant, however, as data drives many of the key initiatives in today’s marketplace, from digital transformation to marketing, customer centricity, population health, and more. This webinar will help demystify data strategy and data architecture and will provide concrete, practical ways to get started.
The document discusses data governance and outlines several key points:
1) Many organizations have little or no focus on data governance, though most CIOs plan to implement enterprise-wide data governance in the next three years.
2) Data governance refers to the overall management of availability, usability, integrity and security of enterprise data.
3) Effective data governance requires policies, processes, business rules, roles and responsibilities, and technologies to be successfully implemented.
Data-Ed Webinar: Data Quality Success Stories (DATAVERSITY)
Organizations must realize what it means to utilize data quality management in support of business strategy. This webinar will demonstrate how chronic business challenges can often be attributed to the root problem of poor data quality. Showing how data quality should be engineered provides a useful framework in which to develop an effective approach. Establishing this framework allows organizations to more efficiently identify business and data problems caused by structural issues versus practice-oriented defects, giving them the skillset to prevent these problems from recurring.
Learning Objectives:
Understanding foundational data quality concepts based on the DAMA DMBOK
Utilizing data quality engineering in support of business strategy
Case Studies illustrating data quality success
Data quality guiding principles & best practices
Steps for improving data quality at your organization
RWDG Slides: What is a Data Steward to do? (DATAVERSITY)
Most people recognize that Data Stewards play an essential role in their Data Governance and Information Governance programs. However, the manner in which Data Stewards are used is not the same from organization to organization. How you use Data Stewards depends on your goals for Data Governance.
Join Bob Seiner for this month’s RWDG webinar where he will share different ways to activate Data Stewards based on the purpose of your program. Bob will talk about options to extend existing Data Steward activity and how to build new functionality into the role of your Data Stewards.
In this webinar, Bob will discuss:
- The crucial role of the Data Steward in Data Governance
- Different types of Data Stewards and what they do
- Aligning Data Steward activities with program goals
- Improving existing Data Steward actions
- Finding new ways to use your Data Stewards
How to identify the correct Master Data subject areas & tooling for your MDM... (Christopher Bradley)
1. What are the different Master Data Management (MDM) architectures?
2. How can you identify the correct Master Data subject areas & tooling for your MDM initiative?
3. A reference architecture for MDM.
4. Selection criteria for MDM tooling.
chris.bradley@dmadvisors.co.uk
This document discusses the development of a data strategy for an organization. It begins by introducing the presenter and organization. It then covers why a data strategy is needed to address common data issues. The strategy should define what the data team will and will not do. Developing the strategy requires gathering information, consulting other teams, and linking it to the organization's mission. Key aspects of the strategy include objectives, principles, delivery areas, and ensuring it is concise enough to be accessible and remembered.
Data Governance Takes a Village (So Why is Everyone Hiding?) (DATAVERSITY)
Data governance represents both an obstacle and opportunity for enterprises everywhere. And many individuals may hesitate to embrace the change. Yet if led well, a governance initiative has the potential to launch a data community that drives innovation and data-driven decision-making for the wider business. (And yes, it can even be fun!). So how do you build a roadmap to success?
This session will gather four governance experts, including Mary Williams, Associate Director, Enterprise Data Governance at Exact Sciences, and Bob Seiner, author of Non-Invasive Data Governance, for a roundtable discussion about the challenges and opportunities of leading a governance initiative that people embrace. Join this webinar to learn:
- How to build an internal case for data governance and a data catalog
- Tips for picking a use case that builds confidence in your program
- How to mature your program and build your data community
Driving Data Intelligence in the Supply Chain Through the Data Catalog at TJX (DATAVERSITY)
Roles and responsibilities are a critical component of every Data Governance program. Building a set of roles that are practical and that will not interfere with people’s “day jobs” is an important consideration that will influence how well your program is adopted. This tutorial focuses on sharing a proven model guaranteed to represent your organization.
Join Bob Seiner for this lively webinar where he will dissect a complete Operating Model of Roles and Responsibilities that encompasses all levels of the organization. Seiner will detail the roles and describe the most effective way to associate people with the roles. You will walk out of this webinar with a model to apply to your organization.
In this session Bob will share:
- The five levels of Data Governance roles
- A proven Operating Model of Roles and Responsibilities
- How to customize the model to meet your requirements
- Setting appropriate role expectations
- How to operationalize the roles and demonstrate value
How to Strengthen Enterprise Data Governance with Data Quality (DATAVERSITY)
If your organization is in a highly-regulated industry – or relies on data for competitive advantage – data governance is undoubtedly a top priority. Whether you’re focused on “defensive” data governance (supporting regulatory compliance and risk management) or “offensive” data governance (extracting the maximum value from your data assets, and minimizing the cost of bad data), data quality plays a critical role in ensuring success.
Join our webinar to learn how enterprise data quality drives stronger data governance, including:
The overlaps between data governance and data quality
The “data” dependencies of data governance – and how data quality addresses them
Key considerations for deploying data quality for data governance
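As a rough illustration of how data quality supports governance, quality rules that a governance program mandates can be expressed as executable checks. The sketch below (all field names and rules are invented for the example) flags records that violate completeness, validity, and conformity rules:

```python
import re

# Hypothetical governance-mandated quality rules, one per field.
RULES = {
    "customer_id": lambda v: v is not None,                    # completeness
    "email": lambda v: v is None or re.match(r"[^@\s]+@[^@\s]+$", v),  # validity
    "country": lambda v: v in {"US", "DE", "FR"},              # conformity
}

def check_record(record):
    """Return the names of the rules this record violates."""
    return [field for field, rule in RULES.items()
            if not rule(record.get(field))]

good = {"customer_id": 1, "email": "a@b.com", "country": "US"}
bad = {"customer_id": None, "email": "not-an-email", "country": "XX"}

print(check_record(good))  # []
print(check_record(bad))   # ['customer_id', 'email', 'country']
```

Running checks like these against incoming data is one concrete way a governance program turns its policies into something measurable.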
To take a “ready, aim, fire” approach to implementing Data Governance, many organizations assess themselves against industry best practices. The process is neither difficult nor time-consuming, and it helps ensure that your activities target your specific needs. Best practices are always a strong place to start.
Join Bob Seiner for this popular RWDG topic, where he will provide the information you need to set your program in the best possible direction. Bob will walk you through the steps of conducting an assessment and share with you a set of typical results from taking this action. You may be surprised at how easy it is to organize the assessment and may hear results that stimulate the actions that you need to take.
In this webinar, Bob will share:
- The value of performing a Data Governance best practice assessment
- A practical list of industry Data Governance best practices
- Criteria to determine if a practice is best practice
- Steps to follow to complete an assessment
- Typical recommendations and actions that result from an assessment
Data Marketplace and the Role of Data Virtualization (Denodo)
Watch full webinar here: https://bit.ly/3IS9sQS
A data marketplace is like an online shopping interface specializing in data. Ideally, it should work just like an online store, with minimal latency and maximum responsiveness. However, this does not mean that all of the data in the data marketplace needs to be stored in the same central repository.
In this session, Shadab Hussain, Americas Sales Head, Data Analytics at Wipro, a partner company with Denodo and a co-sponsor of DataFest 2021, talks about the role of data virtualization in enabling full-featured data marketplaces. Such data marketplaces provide real-time, curated access to data, even when the data is stored across many different sources throughout the organization.
You will learn:
- The main features of a data marketplace
- Why organizations need data marketplaces
- Why data marketplaces sometimes fail
- How data virtualization enables the most effective data marketplaces
- How one of Europe’s premier public healthcare system organizations leveraged a data marketplace to improve data consumption and ease of access
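To make the virtualization idea concrete, here is a toy sketch of the pattern (all dataset and source names are invented): the marketplace catalog maps logical dataset names to whichever live source holds the data, so consumers shop the catalog without anything being copied into a central repository:

```python
# Simulated physical sources (in reality: a warehouse, a CRM API, files...).
warehouse = {"sales_2023": [{"region": "EU", "revenue": 120}]}
crm_api = {"customers": [{"id": 1, "name": "Acme"}]}

# The virtualization layer maps marketplace catalog entries to source queries.
CATALOG = {
    "curated.sales": lambda: warehouse["sales_2023"],
    "curated.customers": lambda: crm_api["customers"],
}

def marketplace_get(dataset_name):
    """Serve a catalog dataset on demand, pulling from the live source."""
    if dataset_name not in CATALOG:
        raise KeyError(f"{dataset_name!r} is not in the marketplace catalog")
    return CATALOG[dataset_name]()

print(marketplace_get("curated.customers"))  # [{'id': 1, 'name': 'Acme'}]
```

A real virtualization layer adds query federation, caching, and security on top, but the catalog-to-source indirection is the core of the pattern.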
The document discusses data governance and why it is an imperative activity. It provides a historical perspective on data governance, noting that as data became more complex and valuable, the need for formal governance increased. The document outlines some key concepts for a successful data governance program, including having clearly defined policies covering data assets and processes, and establishing a strong culture that values data. It argues that proper data governance is now critical to business success in the same way as other core functions like finance.
Data Modeling, Data Governance, & Data Quality (DATAVERSITY)
Data Governance is often referred to as the people, processes, and policies around data and information, and these aspects are critical to the success of any data governance implementation. But just as critical is the technical infrastructure that supports the diverse data environments that run the business. Data models can be the critical link between business definitions and rules and the technical data systems that support them. Without the valuable metadata these models provide, data governance often lacks the “teeth” to be applied in operational and reporting systems.
Join Donna Burbank and her guest, Nigel Turner, as they discuss how data models & metadata-driven data governance can be applied in your organization in order to achieve improved data quality.
The document provides an introduction to Christopher Bradley and his experience in information management, along with a list of his recent presentations and publications. It then outlines that the remainder of the document will discuss approaches to selecting data modelling tools, an evaluation method, vendors and products, and provide a summary.
Emerging Trends in Data Architecture – What’s the Next Big Thing? (DATAVERSITY)
With technological innovation and change occurring at an ever-increasing rate, it’s hard to keep track of what’s hype and what can provide practical value for your organization. Join this webinar to see the results of a recent DATAVERSITY survey on emerging trends in Data Architecture, along with practical commentary and advice from industry expert Donna Burbank.
Good systems development often depends on multiple data management disciplines. One of these is metadata. Much of the discussion around metadata centers on metadata technologies themselves, a tool-and-technology focus that has rarely achieved significant results. A more relevant question, when considering pockets of metadata, is whether to include them in the scope of organizational metadata practices. By understanding metadata practices, you can begin to build systems that allow you to exercise sophisticated data management techniques and support business initiatives.
Learning Objectives:
How to leverage metadata in support of your business strategy
Understanding foundational metadata concepts based on the DAMA DMBOK
Guiding principles & lessons learned
Glossaries, Dictionaries, and Catalogs Result in Data Governance (DATAVERSITY)
Data catalogs, business glossaries, and data dictionaries house metadata that is important to your organization’s governance of data. People in your organization need to be engaged in leveraging the tools, understanding the data that is available, who is responsible for the data, and knowing how to get their hands on the data to perform their job function. The metadata will not govern itself.
Join Bob Seiner for the webinar where he will discuss how glossaries, dictionaries, and catalogs can result in effective Data Governance. People must have confidence in the metadata associated with the data that you need them to trust. Therefore, the metadata in your data catalog, business glossary, and data dictionary must result in governed data. Learn how glossaries, dictionaries, and catalogs can result in Data Governance in this webinar.
Bob will discuss the following subjects in this webinar:
- Successful Data Governance relies on value from very important tools
- What it means to govern your data catalog, business glossary, and data dictionary
- Why governing the metadata in these tools is important
- The roles necessary to govern these tools
- Governance expected from metadata in catalogs, glossaries, and dictionaries
Too often I hear the question “Can you help me with our Data Strategy?” Unfortunately, for most, this is the wrong request because it focuses on the least valuable component – the Data Strategy itself. A more useful request is this: “Can you help me apply data strategically?” Yes, at early maturity phases the process of developing strategic thinking about data is more important than the actual product! Trying to write a good (much less perfect) Data Strategy on the first attempt is generally not productive – particularly given the widespread acceptance of Mike Tyson’s truism: “Everybody has a plan until they get punched in the face.” Refocus on learning how to iteratively improve the way data is strategically applied. This will permit data-based strategy components to keep up with agile, evolving organizational strategies. This approach can also contribute to three primary organizational data goals.
In this webinar, you will learn how improving your organization’s data, the way your people use data, and the way your people use data to achieve your organizational strategy will help in ways never imagined. Data are your sole non-depletable, non-degradable, durable strategic assets, and they are pervasively shared across every organizational area. Addressing existing challenges programmatically includes overcoming necessary but insufficient prerequisites and developing a disciplined, repeatable means of improving business objectives. This process (based on the theory of constraints) is where the strategic data work really occurs, as organizations identify prioritized areas where better assets, literacy, and support (Data Strategy components) can help an organization better achieve specific strategic objectives. Then the process becomes lather, rinse, and repeat. Several complementary concepts are also covered, including:
- A cohesive argument for why Data Strategy is necessary for effective Data Governance
- An overview of prerequisites for effective strategic use of Data Strategy, as well as common pitfalls
- A repeatable process for identifying and removing data constraints
- The importance of balancing business operation and innovation
Rapid prototyping allows companies to tweak IoT solutions before fully developing products. It enables getting customer feedback to refine solutions and identify requirements. Rapid prototyping is low risk and high reward as it does not require expensive hardware or extensive commitments, but can lead to successful deployments through thorough planning.
How to leverage your work with a Product Mindset (Mark Opanasiuk)
1. What is a Product Mindset?
2. Product Thinking Mindset on Personal level.
3. Product Mindset on Organization level.
#FIRMday London 28/04/16 - Cubiks 'High Impact Sifting Solutions' (Emma Mirrington)
Cubiks discuss solutions, using client case studies to illustrate how you can attract, engage, and match the best talent for your organisation, and how you can drive efficiency and streamline the costs of your recruitment processes while engaging candidates through innovative, predictive, and data-driven solutions.
Key Success Factors in New Product Efforts (Atul Setlur)
What makes product efforts successful? Is it chance or is there a discipline? There is a discipline here. Learn the six key factors to developing products successfully.
I presented these slides at Product Management & Innovation Event 2016 (http://www.gan-events.com/m145/)
The document discusses concepts related to product ownership and agile product development. It defines key differences between agile projects and products, and emphasizes measuring success based on customer and user satisfaction rather than output. It also covers developing a product vision, creating user personas, prioritizing work using risk-value-tail analysis, developing a minimum viable product, visualizing usage flows, and planning releases using a story map. The overall message is on focusing development on delivering value to customers and users through an iterative process.
Design Sprints side-by-side: service design sprints vs Google Ventures sprints (Adilson Chicória)
I presented this Design Sprint comparison in 2015 at the Business Analysis Track at The Developer's Conference
http://www.thedevelopersconference.com.br/tdc/2015/portoalegre/trilha-analise-de-negocios
It's based on the MVS Model/The Service Startup by Tenny Pinheiro and servicedesignsprint.com, and on the Google Ventures Design Sprint before the publication of the book Sprint by Jake Knapp.
Unfortunately I missed publishing it in 2015 and haven't updated it since then.
The subject gained traction last year, so this will serve more as a retrospective look at who was using sprints for design, and how, before the buzz.
The document discusses design for manufacturability (DFM) principles and processes. It defines DFM as determining a product's true manufacturing costs early in the design process. The key benefits of DFM include speed to market, improved efficiency, and reduced costs. The document outlines the DFM process, which involves conceptualization, analysis, and redesign to optimize a product for manufacturability. It also discusses key DFM principles such as minimizing parts, standardizing components, and creating modular assemblies.
How to implement research, ideation, prototyping, user testing in agile development process?
How to scale product design process?
What do product manager and product owner do?
CASRO Presentation: Project and Change Management, 1st June 2011 (sam_inamdar)
This presentation shares experiences from a collaborative approach to creating innovative solutions through technology, including insights on managing a technology project life cycle, using tracking tools for change management, managing communications for a virtual team, and other means to foster collaboration.
Book club: INSPIRED: How to Create Tech Products Customers Love (SEB)
This document discusses best practices for product development. It covers four main areas: product, people, process, and culture. For product, it emphasizes the importance of a compelling long-term vision and strategy aligned with business goals. For people, it discusses assembling cross-functional product teams with dedicated product managers and designers. For process, it advocates for separating product discovery and delivery, with discovery focused on validating ideas through prototyping and customer testing. Finally, for culture it notes the need for experimentation, empowerment, and a customer-centric mindset to foster innovation.
This document discusses moving from an idea to building a minimum viable product (MVP). It notes that at the idea stage, uncertainty is high but the cost of changes is low. Methodologies like design thinking and sprint processes are recommended to iteratively test hypotheses with users through activities like interviews, prototyping, and testing to help validate the idea and prioritize features before developing an MVP. The goal is to use a fast, cheap iterative process to de-risk the idea and better understand user needs and the solution before committing significant resources to building the full product.
Building & launching mobile & digital products (Anurag Jain)
These slides are an introduction to Product Management for building & launching mobile & digital products for consumers. It covers the basics of Product Management as well as gives an overview of the Product Management process and a practical, iterative approach to building products.
How to Evaluate Solutions and Build your Evaluation Committee (Blytheco)
In the fourth installment of the series "Are You Ready for Replatforming?", we take a look at a formalized process for creating criteria and steps for making an ERP or CRM solution transition, including who should be involved in the process and how they should participate.
The document provides an overview of agile methods and approaches for software development. It discusses why agility is needed given rapidly changing business environments. Traditional sequential approaches are compared to iterative agile approaches. Specific agile frameworks like Scrum, Extreme Programming (XP), Kanban, and Lean-Agile are described. Benefits of agile include increased business value, reduced risk and uncertainty, and ability to respond to changing customer needs. The document provides details on how each framework works and when each is best applied.
MYIR Product Brochure - A Global Provider of Embedded SOMs & Solutions (Linda Zhang)
This brochure introduces MYIR Electronics and its products and services.
MYIR Electronics Limited (MYIR for short), established in 2011, is a global provider of embedded System-On-Modules (SOMs) and comprehensive solutions based on various architectures such as ARM, FPGA, RISC-V, and AI. We cater to customers' needs for large-scale production, offering customized design, industry-specific application solutions, and one-stop OEM services.
MYIR, recognized as a national high-tech enterprise, is also listed among the "Specialized and Special New" Enterprises in Shenzhen, China. Our core belief is that "our success stems from our customers' success," and we embrace the philosophy of "Make Your Idea Real, then My Idea Realizing!"
The DealBook is our annual overview of the Ukrainian tech investment industry. This edition comprehensively covers the full year 2023 and the first deals of 2024.
Quality Patents: Patents That Stand the Test of Time (Aurora Consulting)
Is your patent a vanity piece of paper for your office wall? Or is it a reliable, defendable, assertable, property right? The difference is often quality.
Is your patent simply a transactional cost and a large pile of legal bills for your startup? Or is it a leverageable asset worthy of attracting precious investment dollars, worth its cost in multiples of valuation? The difference is often quality.
Is your patent application only good enough to get through the examination process? Or has it been crafted to stand the tests of time and varied audiences if you later need to assert that document against an infringer, find yourself litigating with it in an Article 3 Court at the hands of a judge and jury, God forbid, end up having to defend its validity at the PTAB, or even needing to use it to block pirated imports at the International Trade Commission? The difference is often quality.
Quality will be our focus for a good chunk of the remainder of this season. What goes into a quality patent, and where possible, how do you get it without breaking the bank?
** Episode Overview **
In this first episode of our quality series, Kristen Hansen and the panel discuss:
⦿ What do we mean when we say patent quality?
⦿ Why is patent quality important?
⦿ How to balance quality and budget
⦿ The importance of searching, continuations, and draftsperson domain expertise
⦿ Very practical tips, tricks, examples, and Kristen’s Musts for drafting quality applications
https://www.aurorapatents.com/patently-strategic-podcast.html
How to Avoid Learning the Linux-Kernel Memory Model (ScyllaDB)
The Linux-kernel memory model (LKMM) is a powerful tool for developing highly concurrent Linux-kernel code, but it also has a steep learning curve. Wouldn't it be great to get most of LKMM's benefits without the learning curve?
This talk will describe how to do exactly that by using the standard Linux-kernel APIs (locking, reference counting, RCU) along with a few simple rules of thumb, thus gaining most of LKMM's power with less learning. And the full LKMM is always there when you need it!
The Rise of Supernetwork Data Intensive Computing (Larry Smarr)
Invited Remote Lecture to SC21
The International Conference for High Performance Computing, Networking, Storage, and Analysis
St. Louis, Missouri
November 18, 2021
Quantum Communications Q&A with Gemini LLM. These are based on Shannon's noisy-channel theorem and explore how the classical theory applies to the quantum world.
Interaction Latency: Square's User-Centric Mobile Performance Metric (ScyllaDB)
Mobile performance metrics often take inspiration from the backend world and measure resource usage (CPU usage, memory usage, etc) and workload durations (how long a piece of code takes to run).
However, mobile apps are used by humans and the app performance directly impacts their experience, so we should primarily track user-centric mobile performance metrics. Following the lead of tech giants, the mobile industry at large is now adopting the tracking of app launch time and smoothness (jank during motion).
At Square, our customers spend most of their time in the app long after it's launched, and they don't scroll much, so app launch time and smoothness aren't critical metrics. What should we track instead?
This talk will introduce you to Interaction Latency, a user-centric mobile performance metric inspired by the Web Vitals metric “Interaction to Next Paint” (web.dev/inp). We'll go over why apps need to track this, how to properly implement its tracking (it's tricky!), how to aggregate this metric, and what thresholds you should target.
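As a rough illustration of the aggregation step, latency metrics are usually reported as a high percentile compared against a budget. The sketch below uses nearest-rank percentiles; the sample values, the 90th percentile, and the 100 ms budget are all assumptions for the example, not Square's actual figures:

```python
def percentile(samples, pct):
    """Nearest-rank percentile of a list of latency samples (ms)."""
    ordered = sorted(samples)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

# Hypothetical per-interaction latencies collected in the field, in ms.
latencies_ms = [35, 48, 52, 60, 75, 80, 95, 110, 140, 300]

BUDGET_MS = 100
p90 = percentile(latencies_ms, 90)
print(p90, "ms:", "OK" if p90 <= BUDGET_MS else "over budget")
```

Reporting a percentile rather than a mean keeps the metric honest: a handful of very slow interactions (like the 300 ms outlier above) dominate user perception but barely move an average.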
What's Next: Web Development Trends to Watch (SeasiaInfotech2)
Explore the latest advancements and upcoming innovations in web development with our guide to the trends shaping the future of digital experiences. Read our article today for more information.
In this follow-up session on knowledge and prompt engineering, we will explore structured prompting, chain of thought prompting, iterative prompting, prompt optimization, emotional language prompts, and the inclusion of user signals and industry-specific data to enhance LLM performance.
Join EIS Founder & CEO Seth Earley and special guest Nick Usborne, Copywriter, Trainer, and Speaker, as they delve into these methodologies to improve AI-driven knowledge processes for employees and customers alike.
GDG Cloud Southlake #34: Neatsun Ziv: Automating Appsec (James Anderson)
The lecture titled "Automating AppSec" delves into the critical challenges associated with manual application security (AppSec) processes and outlines strategic approaches for incorporating automation to enhance efficiency, accuracy, and scalability. The lecture is structured to highlight the inherent difficulties in traditional AppSec practices, emphasizing the labor-intensive triage of issues, the complexity of identifying responsible owners for security flaws, and the challenges of implementing security checks within CI/CD pipelines. Furthermore, it provides actionable insights on automating these processes to not only mitigate these pains but also to enable a more proactive and scalable security posture within development cycles.
The Pains of Manual AppSec:
This section will explore the time-consuming and error-prone nature of manually triaging security issues, including the difficulty of prioritizing vulnerabilities based on their actual risk to the organization. It will also discuss the challenges in determining ownership for remediation tasks, a process often complicated by cross-functional teams and microservices architectures. Additionally, the inefficiencies of manual checks within CI/CD gates will be examined, highlighting how they can delay deployments and introduce security risks.
Automating CI/CD Gates:
Here, the focus shifts to the automation of security within the CI/CD pipelines. The lecture will cover methods to seamlessly integrate security tools that automatically scan for vulnerabilities as part of the build process, thereby ensuring that security is a core component of the development lifecycle. Strategies for configuring automated gates that can block or flag builds based on the severity of detected issues will be discussed, ensuring that only secure code progresses through the pipeline.
Triaging Issues with Automation:
This segment addresses how automation can be leveraged to intelligently triage and prioritize security issues. It will cover technologies and methodologies for automatically assessing the context and potential impact of vulnerabilities, facilitating quicker and more accurate decision-making. The use of automated alerting and reporting mechanisms to ensure the right stakeholders are informed in a timely manner will also be discussed.
Identifying Ownership Automatically:
Automating the process of identifying who owns the responsibility for fixing specific security issues is critical for efficient remediation. This part of the lecture will explore tools and practices for mapping vulnerabilities to code owners, leveraging version control and project management tools.
Three Tips to Scale the Shift Left Program:
Finally, the lecture will offer three practical tips for organizations looking to scale their Shift Left security programs. These will include recommendations on fostering a security culture within development teams, employing DevSecOps principles to integrate security throughout the development
Fluttercon 2024: Showing that you care about security - OpenSSF Scorecards fo... (Chris Swan)
Have you noticed the OpenSSF Scorecard badges on the official Dart and Flutter repos? It's Google's way of showing that they care about security. Practices such as pinning dependencies, branch protection, required reviews, continuous integration tests etc. are measured to provide a score and accompanying badge.
You can do the same for your projects, and this presentation will show you how, with an emphasis on the unique challenges that come up when working with Dart and Flutter.
The session will provide a walkthrough of the steps involved in securing a first repository, and then what it takes to repeat that process across an organization with multiple repos. It will also look at the ongoing maintenance involved once scorecards have been implemented, and how aspects of that maintenance can be better automated to minimize toil.
AC Atlassian Coimbatore Session Slides, 22/06/2024 (apoorva2579)
These are the combined sessions from the ACE Atlassian Coimbatore event held on 22nd June 2024.
The session order is as follows:
1. AI and future of help desk by Rajesh Shanmugam
2. Harnessing the power of GenAI for your business by Siddharth
3. Fallacies of GenAI by Raju Kandaswamy
Video traffic on the Internet is constantly growing; networked multimedia applications consume a predominant share of the available Internet bandwidth. A major technical breakthrough, and an enabler of both multimedia systems research and industrial networked multimedia services, was the HTTP Adaptive Streaming (HAS) technique. This resulted in the standardization of MPEG Dynamic Adaptive Streaming over HTTP (MPEG-DASH) which, together with HTTP Live Streaming (HLS), is widely used for multimedia delivery in today’s networks. Existing challenges in multimedia systems research deal with the trade-off between (i) the ever-increasing content complexity, (ii) various requirements with respect to time (most importantly, latency), and (iii) quality of experience (QoE). Optimizing towards one aspect usually negatively impacts at least one of the other two, if not both. This situation sets the stage for our research work in the ATHENA Christian Doppler (CD) Laboratory (Adaptive Streaming over HTTP and Emerging Networked Multimedia Services; https://athena.itec.aau.at/), jointly funded by public sources and industry. In this talk, we will present selected novel approaches and research results from the first year of the ATHENA CD Lab’s operation. We will highlight HAS-related research on (i) multimedia content provisioning (machine learning for video encoding); (ii) multimedia content delivery (support of edge processing and virtualized network functions for video networking); (iii) multimedia content consumption and end-to-end aspects (player-triggered segment retransmissions to improve video playout quality); and (iv) novel QoE investigations (adaptive point cloud streaming). We will also put the work into the context of international multimedia systems research.
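The core idea behind HAS players (MPEG-DASH and HLS alike) is that the client picks a rendition from a bitrate ladder based on its own throughput estimate. A minimal throughput-based selection sketch — the ladder values are illustrative, not from any standard:

```python
# Minimal sketch of throughput-based adaptive bitrate (ABR) selection,
# the client-side heuristic at the heart of HTTP Adaptive Streaming.
# Ladder values (in kbps) are illustrative.

BITRATE_LADDER_KBPS = [300, 750, 1500, 3000, 6000]  # available renditions

def select_bitrate(throughput_kbps: float, safety: float = 0.8) -> int:
    """Pick the highest rendition not exceeding a safety-scaled
    throughput estimate; fall back to the lowest rendition."""
    budget = throughput_kbps * safety
    candidates = [b for b in BITRATE_LADDER_KBPS if b <= budget]
    return max(candidates) if candidates else BITRATE_LADDER_KBPS[0]
```

Real players add buffer-based signals and smoothing on top of this, which is exactly where the latency-vs-QoE trade-off described above shows up.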
An invited talk given by Mark Billinghurst on Research Directions for Cross Reality Interfaces. This was given on July 2nd 2024 as part of the 2024 Summer School on Cross Reality in Hagenberg, Austria (July 1st - 7th)
Are you interested in dipping your toes in the cloud native observability waters, but as an engineer you are not sure where to get started with tracing problems through your microservices and application landscapes on Kubernetes? Then this is the session for you, where we take you on your first steps in an active open-source project that offers a buffet of languages, challenges, and opportunities for getting started with telemetry data.
The project is called OpenTelemetry, but before diving into the specifics, we’ll start with de-mystifying key concepts and terms such as observability, telemetry, instrumentation, cardinality, and percentile to lay a foundation. After understanding the nuts and bolts of observability and distributed traces, we’ll explore the OpenTelemetry community: its Special Interest Groups (SIGs), its repositories, and how to become not only an end user but possibly a contributor. We will wrap up with an overview of the components in this project, such as the Collector, the OpenTelemetry protocol (OTLP), its APIs, and its SDKs.
Attendees will leave with an understanding of key observability concepts, become grounded in distributed tracing terminology, be aware of the components of OpenTelemetry, and know how to take their first steps toward an open-source contribution!
Key Takeaways: Open-source, vendor-neutral instrumentation is an exciting new reality as the industry standardizes on OpenTelemetry for observability. OpenTelemetry is on a mission to enable effective observability by making high-quality, portable telemetry ubiquitous. The world of observability and monitoring today has a steep learning curve, and in order to achieve ubiquity, the project would benefit from growing our contributor community.
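One of the terms above, percentile, underpins the ubiquitous "p95 latency" panel. A plain-Python sketch of the nearest-rank method — this is not OpenTelemetry API code, and the sample values are made up:

```python
# Illustrative nearest-rank percentile over latency samples: the
# smallest value such that at least p% of samples are at or below it.
import math

def percentile(samples: list[float], p: float) -> float:
    """Nearest-rank percentile of a non-empty sample list."""
    if not samples:
        raise ValueError("no samples")
    ordered = sorted(samples)
    rank = max(1, math.ceil(p / 100 * len(ordered)))
    return ordered[rank - 1]

latencies_ms = [12, 15, 11, 250, 14, 13, 16, 18, 17, 900]
p95 = percentile(latencies_ms, 95)  # dominated by the tail outliers
```

The example also hints at why percentiles, not averages, are the standard latency metric: a handful of slow requests moves p95 dramatically while barely shifting the mean's story about typical behavior.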
2. Actionable Requirements
The challenges we face
Defining product requirements
Assessing opportunities
Discovering solutions
Tools for opportunity assessment
The process of solution discovery and definition
The agile product innovation process
Q&A
6. Actionable Requirements
Agile teams are now developing software more quickly than ever before. Unfortunately, this doesn’t mean they are always aimed at building the right products.
-Mike Cohn, author of Agile Estimating and Planning
21. Actionable Requirements
Some best practices
Have a Customer Advisory Board (CAB)
Need two for enterprise software
Plan 3-6 months in advance
Your team
Facilitator
Helper
Observer/Photographer
Try it on internal customers and teams first!
22. Actionable Requirements
Compile and consolidate learning
Revisit the planning onion
Write up the marketing requirements document (MRD)
Determine the minimal marketable feature-set (MMF)
Let’s look at an example
23. Actionable Requirements
A one-page opportunity assessment
1) Exactly what problem will this solve? (value proposition)
2) For whom do we solve the problem? (target market)
3) How will we measure success?
4) What alternatives are out there?
5) Why are we best suited to do this?
6) Why now?
7) How will we deploy this?
8) What is the preliminary cost?
9) What factors are critical to success?
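The nine questions above amount to a fixed template, which makes them easy to capture as a data structure. A sketch of that template as a Python dataclass — the field names are my own shorthand, not from the deck:

```python
# Illustrative data structure for the one-page opportunity assessment.
# Field names are invented shorthand for the nine questions above.
from dataclasses import dataclass

@dataclass
class OpportunityAssessment:
    value_proposition: str   # 1) exactly what problem will this solve?
    target_market: str       # 2) for whom do we solve the problem?
    success_metrics: str     # 3) how will we measure success?
    alternatives: str        # 4) what alternatives are out there?
    differentiation: str     # 5) why are we best suited to do this?
    timing: str              # 6) why now?
    go_to_market: str        # 7) how will we deploy this?
    preliminary_cost: str    # 8) what is the preliminary cost?
    critical_factors: str    # 9) what factors are critical to success?
```

Treating the assessment as structured data means every proposed opportunity answers the same nine questions, which keeps a backlog of one-pagers comparable.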
27. Actionable Requirements
Benefits of a high fidelity prototype
Provides a way to test out your ideas before spending
time and money to build them for real
Forces you to think about the problem at a much greater
level of detail
Gives developers a far more detailed product
requirement
30. Actionable Requirements
Conclusion
We can have a process for requirements gathering
Three phases to actionable requirements
Phase                         | People                                      | Tools/Artifacts
Opportunity Assessment        | Product Mkg, Product Mgmt, Architecture     | MRD, Innovation Games
Solution Discovery and Design | User Experience, Lead Engineer, Product Mgr | PRD, Prototype, HiFi Wireframe
Solution Execution            | Engineering/Test Teams                      | Fully functioning product
31. Actionable Requirements
Resources:
Innovation Games – Luke Hohmann
Ten Faces of Innovation – Tom Kelley
Inspired – Marty Cagan
Purple Cow – Seth Godin
Agile Estimating and Planning – Mike Cohn
Dilbert – Scott Adams