This document provides an overview of a tutorial on event processing under uncertainty. It begins with an introduction discussing how most real-world data contains some level of uncertainty, illustrated by an example of using event processing to detect crimes from uncertain video surveillance footage and citizen reports. It then outlines the topics to be covered, including representing and modeling different types of uncertainty and extending event processing techniques to handle uncertain event data.
Risk management of events (Prof. Doutor A. Rui Teixeira Santos)
This document describes an intensive program on sustainable marketing management of events and festivals. It covers topics like event impact assessment, sustainability, and risk management. The program is taught by Prof. Rui Teixeira Santos of ISCEM and runs in Rome from July 1-13, 2012. The schedule includes teaching units on these topics as well as breaks for lunch. Events can refer to many types of gatherings, from ceremonies and competitions to conventions and festivals. Contemporary marketing uses events as a communication tool to build corporate image and relationships with consumers.
This document discusses various aspects of event management including food and beverage management, event venues, requirements of business travelers, checklists, safety and security considerations, risk management, and several tourism festivals in India. Specifically, it provides details on types of meal functions and factors to consider in menu planning for food and beverage management. It also outlines 13 factors to consider when selecting an event venue and lists various types of venues including conference centers, hotels, convention centers, outdoor spaces, and more.
This document contains a risk assessment of an event that suffered from several issues. It identifies 14 internal and external risks, 8 natural and human threats, 12 safety and security concerns, 2 event crime risks, and 3 capability weaknesses. Major risks included delays in setup, lack of staff for checking IDs and bags, staff IDs that were too small, unidentified organizers, event infrastructure unsuited to the weather, lack of medical support, unsecured areas, food spoilage, and weak marketing. Most recommendations centered on proper planning, staffing, signage, and emergency protocols.
This document discusses risk management for student event planning. It defines risk as potential for loss or undesirable outcomes. There are several types of risks including physical risks of injury, reputation risks of negative representation, emotional risks of participant reactions, financial risks of budget issues, and facilities risks of safety concerns at event locations. Examples are provided for each type of risk student organizations may face when planning events.
Event risk management is the primary factor to consider in organizing an event, whether indoor or outdoor. Risk management is important in order to avoid waste and losses. This slideshare gives you a better understanding of what we call risk management, the risks involved in organizing events, and how to avoid or reduce them.
The document discusses various aspects of event planning and management. It begins by defining what an event is and different types of events. It then discusses event management and the key aspects involved, including market research, SWOT analysis, event planning using the 5 Ws framework, venue selection, marketing, evaluation and feedback. Key elements of event planning covered include determining objectives, activities, schedule, budget, target audience and addressing logistical considerations like location, date and time. The document provides an overview of best practices for comprehensive event planning.
This document discusses how the cloud is well suited to address the challenges of big data. It notes that big data sets are getting larger and more complex, requiring new tools and approaches. The cloud optimizes precious IT resources by enabling elastic scaling, global accessibility, easy experimentation, and reducing costs. The cloud empowers users to balance costs and time. Several real-world examples are provided, such as banks using the cloud to perform Monte Carlo simulations and retailers using it for targeted recommendations and click stream analysis.
Kim Escherich - How Big Data Transforms Our World (BigDataViz)
This document discusses how big data is transforming our world. It notes that the volume and velocity of data is exploding, with more connected devices, sensors, and digital interactions creating petabytes and zettabytes of data. It also discusses how this data can provide insights if analyzed for patterns and trends using advanced analytics. Examples are given of how big data insights can help businesses innovate new products, optimize operations in real-time, better understand customer behavior, and more effectively measure risk and fraud.
Big Data is growing rapidly in terms of volume, variety, and velocity. The cloud is well-suited to handle Big Data challenges by providing elastic and scalable infrastructure, which optimizes resources and reduces costs compared to traditional IT. In the cloud, users can collect, store, analyze and share large amounts of data without upfront investment, and scale easily as needs change. Real-world examples show how companies in industries like banking, retail, and advertising are using the cloud's Big Data services to gain insights from large datasets.
This document discusses big data and analytics. It notes that digital data is growing exponentially and will reach 35 zettabytes by 2020, with 80% coming from enterprise systems. Big data is being driven by increased transaction data, interaction data from mobile and social media, and improved processing capabilities. Major players in big data include Google, Amazon, IBM and Microsoft. Traditional analytics struggle due to batch processing and lack of business context. The document introduces OpTier's approach of capturing real-time business context across interactions to enable insights with low costs and flexibility. Potential use cases for financial services are discussed.
This document discusses big data and analytics, including how much data is being generated, what is driving this disruption, and who the major players are. It notes issues with current analytics approaches being slow and expensive. The document introduces OpTier's approach of establishing real-time business context across transactions to more quickly gain insights. Potential use cases for financial services are also outlined, such as fraud prevention, customer behavior analysis, and understanding the impact of IT performance on business outcomes.
What is big data - Architectures and Practical Use Cases (Tony Pearson)
1. Big data is the analysis of large volumes of diverse data to identify trends, patterns and insights to make better business decisions. It allows companies to cost efficiently process growing data volumes and collectively analyze the broadening variety of data.
2. The document discusses architectures and practical use cases of big data. It provides examples of how companies are using big data to optimize operations, innovate new products, and gain instant awareness of fraud and risk.
3. Realizing the opportunities of big data requires thinking beyond traditional data sources to include machine, transactional, social, and enterprise content data. It also requires multiple platform capabilities like Hadoop, data warehousing, and stream computing.
The document discusses how large amounts of data generated from various sources globally can be processed and exploited using data science, artificial intelligence, machine learning and deep learning. These techniques help decision makers make more accurate and efficient decisions. Specifically, it discusses how structured, unstructured and semi-structured data from a variety of sources can be extracted, transformed and loaded into a data warehouse for analysis. It then provides overviews of artificial intelligence, machine learning, deep learning and data science and how they can be used together to solve real-world problems by learning from data.
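The extract-transform-load flow described above can be sketched in a few lines; this is a minimal, illustrative example with two toy in-memory sources, and all field names and the flat "warehouse" schema are hypothetical, not taken from the document.

```python
# Minimal ETL sketch: extract records from heterogeneous sources,
# transform them into one schema, and load them into a "warehouse" table.

structured = [{"id": 1, "amount": "100.5"}]                 # e.g. a CSV export
semi_structured = [{"id": "2", "payload": {"amount": 40}}]  # e.g. JSON events

def extract():
    """Pull raw records from every source, one stream."""
    yield from structured
    yield from semi_structured

def transform(record):
    """Flatten and coerce types so every record fits one warehouse schema."""
    amount = record.get("amount", record.get("payload", {}).get("amount"))
    return {"id": int(record["id"]), "amount": float(amount)}

warehouse = [transform(r) for r in extract()]
print(warehouse)  # [{'id': 1, 'amount': 100.5}, {'id': 2, 'amount': 40.0}]
```

Once the records share one schema, they can be queried or fed to the analysis techniques the document surveys.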
Big Data on AWS
The document discusses how the cloud is well suited to support big data applications and analytics. It notes that the cloud provides elastic, on-demand infrastructure that optimizes resources and reduces costs compared to traditional IT. This allows organizations to focus on analyzing and using big data rather than managing infrastructure. The cloud also enables the collection and storage of massive datasets. Examples are given of companies using cloud-based big data for applications like risk analysis, recommendations, and targeted advertising.
"Intelligent systems" are the new generation of embedded systems which, building on the robustness and determinism of their predecessors, connect to the cloud to enrich the user experience, whether for businesses (for example collecting data or monitoring systems), for individuals (at home, in a medical context, or in the car), or for other machines (in the case of large-scale automated systems). The cloud, and Windows Azure in particular, provides the communication channels and the means to store and process data at massive scale, offloading local installations and thus making these systems simpler to deploy. This session, rich in concrete examples, will present Microsoft's strategy for the future of embedded systems and their connection to the cloud, as well as the technologies and partnerships put in place to accelerate these intelligent-system deployments, with an example that will speak to everyone: the future of the car, with Windows Embedded Automotive!
Dynamic Semantics for Dynamic IoT Environments (Payam Barnaghi)
This document discusses the need for dynamic semantics to handle the complex and changing nature of data in IoT environments. It notes that while semantic models and ontologies exist and are helpful for interoperability, they need to be designed simply and account for the dynamic nature of IoT data. Semantic annotations may change over time and location, and tools are needed to update them automatically. Overall, semantics are an important part of solving interoperability but must be implemented carefully considering the constraints of IoT environments.
To date the hype about Big Data has come from the perception that Big Data is the next frontier in gaining deeper insight into the customer, or an organization’s business. In the next decade, it is hoped (or hyped) that Big Data will be key to finding ways to better analyze, monetize, and capitalize on these information streams and integrate these insights into the business. It will be the age of Big Data.
But where else will Big Data have an impact?
For IT departments, Big Data represents a new set of requirements and supporting infrastructures, and will require a well thought-out approach to the datacenter network. The network will serve as the key backbone for the special communications and processing needs of Big Data. In preparation for this new era, Network Administrators should begin now to proactively plan for how to mitigate the latency and switch capacity issues that will arise from Big Data’s special needs.
Please join us for this presentation as we explore the implications of Big Data on the network.
This document provides an overview of big data by exploring its definition, origins, characteristics and applications. It defines big data as large data sets that cannot be processed by traditional software tools due to size and complexity. Doug Laney is identified as the originator of the concept, having defined the 3Vs of big data in 2001 - volume, velocity and variety. A variety of sectors where big data is used are discussed, including social media, science, retail and government. The document concludes by stating we are in the age of big data due to new capabilities for analyzing large data sets quickly and cost-effectively.
This document provides an overview of big data by exploring its definition, origins, characteristics and applications. It defines big data as large datasets that cannot be processed by traditional software tools due to size and complexity. The document traces the development of big data to the early 2000s and identifies the 3 V's of big data as volume, velocity and variety. It also discusses how big data is classified and the technologies used to analyze it. Finally, the document provides examples of domains where big data is utilized, such as social media, science, and retail, before concluding on the revolutionary potential of big data.
The document discusses the challenges of big data and data quality. It defines big data as large volumes of data that are difficult to process using traditional database systems. Big data comes from various sources like social media, machines, and open data. It highlights that poor data quality will undermine the value of big data investments, and that data quality foundations are needed to build successful big data and analytics programs. Effective data integration, profiling, standardization, and governance are critical to addressing the data quality imperative of big data.
Extent 2013 Obninsk: Managing Uncertain Data at Scale (extentconf Tsoy)
By 2015, 80% of the world's data will be uncertain. Managing uncertain data requires new techniques that can handle data from various sources like sensors and social media that contain inconsistencies, ambiguities and other uncertainties. Statistical techniques can help reduce uncertainty in analytical models by treating attributes as dimensions in a space and evaluating similarity through closeness in that multidimensional space. As more uncertain data is generated, analytics tools will need to be expanded to support ingestion and interpretation of unstructured data to enable adaptation, learning and better decision making.
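The closeness-in-attribute-space idea mentioned above can be sketched in a few lines. This is only an illustration: the choice of Euclidean distance and the particular distance-to-similarity mapping are assumptions for the example, not prescribed by the document.

```python
import math

def euclidean_distance(a, b):
    """Distance between two records treated as points in attribute space."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def similarity(a, b):
    """Map distance to a (0, 1] score: closer points score higher."""
    return 1.0 / (1.0 + euclidean_distance(a, b))

# Two sensor readings, each with three numeric attributes
reading_a = (2.0, 1.0, 0.0)
reading_b = (2.0, 1.0, 3.0)
print(similarity(reading_a, reading_b))  # distance 3.0 -> similarity 0.25
```

Evaluating similarity this way lets statistical models treat near-identical uncertain readings as likely observations of the same underlying event.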
This document discusses how adopting cloud computing can help organizations become more agile by industrializing their IT functions. It notes that cloud computing allows for the provisioning of resources on demand, virtualization, and a stateless and interchangeable infrastructure. The document advocates for techniques like centralization, networking, automation, cost modeling, design for failure, and security to change IT architectures and take advantage of applications in the cloud or for the cloud. The goal is to help organizations move quickly and easily like "agile monkeys" through cloud-powered mechanization and automation of their IT operations.
This document discusses big data and the importance of data quality for big data initiatives. It defines big data as large, diverse digital data sets that require new techniques to enable capture, storage, analysis and visualization. The key challenges of big data include integrating diverse structured and unstructured data sources and ensuring high quality data. The document emphasizes that poor data quality can undermine big data analytics efforts and lead to wrong insights. It promotes establishing a data quality framework including profiling, standardization, matching and enrichment to enable valid big data analytics.
This document provides an agenda for a presentation on harnessing the power of big data with Oracle. The agenda includes introductions to big data and market trends, defining big data, an overview of the Oracle Big Data Appliance, Oracle's integrated software solution, and a demonstration. The presentation aims to show how Oracle can help organizations access and analyze large, diverse datasets to drive innovation.
DEBS 2019 tutorial: Correctness and consistency of event-based systems (Opher Etzion)
The tutorial includes: temporal correctness, tuning up the semantics of event-based applications, data consistency, and validation and verification of event-based systems
SW architectures 2018 on microservices and EDA (Opher Etzion)
This document discusses microservices and event-driven architectures. It provides an example of a "smart road" system using these approaches. Events are used to communicate between isolated microservices to handle tasks like vehicle identification, traffic rate calculation, billing, and detecting stolen vehicles. The document also discusses using events to handle consistency requirements when processing salary increases across budget-checking and other microservices. Events allow microservices to remain independent while coping with global consistency needs through approaches like compensating transactions triggered by event notifications.
ER 2017 tutorial - On Paradoxes, Autonomous Systems and dilemmas (Opher Etzion)
This document discusses paradoxes, autonomous systems, and the dilemmas they present. It covers paradoxes like Russell's paradox and Zeno's paradox to illustrate logical contradictions. It then outlines how autonomous systems work by sensing their environment, making sense of inputs, making decisions, and acting. Examples of current and future applications in healthcare, industrial, and military robotics are provided. However, it also notes the dangers of inferring causation from correlation and the sources of uncertainty. Deep learning allows autonomous systems to self-learn but lacks transparency, which could enable remote killing and raises issues around social equality if advanced systems surpass human capabilities.
This presentation, given at the un-conference on technology and art, is a first glance at new research on using event-driven technology to create a new type of literature.
Has Internet of Things really happened? (Opher Etzion)
The document discusses challenges facing widespread adoption of the Internet of Things (IoT). While sensor technology exists, each sensor operates in isolation and multi-sensor systems are difficult to construct. Other challenges include data quality issues, lack of ubiquitous networks, integration difficulties, need for more sensor innovation, and inadequate security for the status quo. Standardization efforts are underway but democratizing use through sensor integration, personalization, and pervasive applications remains challenging.
On the personalization of event-based systems (Opher Etzion)
The document discusses personalizing event-based systems to better assist elderly individuals. It proposes sensors around the home that can detect events like an unlocked door, falling, or vocal distress to alert family members. While the technology exists, it needs to be more personalized, affordable, and simple to use. The research requires a multi-disciplinary effort across technology, human factors, economics, and specific domains. Personalizing such systems to individuals and simplifying the technology is seen as key to widespread adoption.
On Internet of Everything and Personalization. Talk in INTEROP 2014 (Opher Etzion)
This document discusses the Internet of Everything (IoE) and personalization. It begins by defining the IoE as connecting people and machines through technologies like sensors. Examples are given of how sensors can detect situations and trigger actions. Challenges to the IoE include data quality issues, lack of integration between sensors, and complexity that limits widespread use. The document argues for standards and simplifying models to make IoE technologies more accessible and customizable to individual needs and situations. Security, privacy and the democratization of IoE technologies are also addressed.
Introduction to the Institute of Technological Empowerment (Opher Etzion)
This short presentation was given on July 9, 2014 at a meeting with people interested in the Institute of Technological Empowerment, and provides a brief introduction to the institute's goals and activities.
DEBS 2014 tutorial on the Internet of Everything (Opher Etzion)
This document provides an overview of the Internet of Everything (IoE) and event processing. It discusses how IoE is an extension of concepts like the Internet of Things (IoT) and how everything will become connected. The document outlines how IoE will impact different areas like healthcare, agriculture, cities, retail, homes and more. It also presents a futuristic view where driverless cars and automated personal assistants become common due to advances in sensors and connectivity.
The Internet of Things and some introduction to the Technological Empowerment... (Opher Etzion)
This document outlines the vision for the Technological Empowerment Institute, which aims to empower people and businesses in developing areas through smart sensor-based systems for situation awareness. The institute will have three legs: a multi-disciplinary research leg, an implementation leg through partnerships, and an education leg. Some example target applications mentioned are assistance for elderly independent living through situational awareness alerts, supply chain monitoring, agriculture environmental control, and water quality monitoring. The overall goal is to help realize the power of event-driven systems to create a better world.
ER 2013 tutorial: Modeling the event driven world (Opher Etzion)
This document discusses event-driven thinking and modeling compared to traditional request-driven computing. Some key points made:
1) Event-driven applications follow the "4D" paradigm of detect, derive, decide, do to achieve awareness and reaction, while traditional systems are more request-driven.
2) Event-driven logic is sensitive to the timing and order of events, rather than just the event occurrence. Temporal considerations are important.
3) State handling is more complex with events, as logic may depend on patterns spanning unmatched past events.
Existing event processing languages still require technical skills and do not fully address modeling at the business level in an intuitive way; purely computational approaches are not enough.
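The order- and timing-sensitivity described in the points above can be illustrated with a minimal sequence-pattern detector. The event names and the 60-second window below are assumptions made up for the example, not part of the original tutorial.

```python
# Sketch of the "detect" step of the 4D paradigm: logic that depends on
# the order and timing of events, not just on their occurrence.
# Pattern: a "withdraw" that follows a "login" within 60 seconds.

def detect_sequence(events, first, second, window):
    """Return True if `second` occurs after `first` within `window` seconds.

    `events` is an iterable of (name, timestamp) pairs in timestamp order.
    """
    pending = None  # timestamp of the most recent unmatched `first` event
    for name, ts in events:
        if name == first:
            pending = ts
        elif name == second and pending is not None and ts - pending <= window:
            return True
    return False

stream = [("login", 0), ("browse", 10), ("withdraw", 45)]
print(detect_sequence(stream, "login", "withdraw", 60))  # True
```

The same two events in a different order, or farther apart in time, would not match, which is exactly the temporal sensitivity that request-driven logic struggles to express.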
DEBS 2013 tutorial: Why is event-driven thinking different from traditional ... (Opher Etzion)
This document discusses the differences between traditional and event-driven thinking in computing. It begins with a brief history of event processing, noting its origins in academic projects in the early 2000s and increased adoption in recent years driven by big data and IoT trends. The document then outlines some key differences in event-driven thinking, such as its focus on reacting immediately to events rather than responding to periodic queries. It provides an example of using events to detect suspicious banking activity in real-time. Finally, it discusses challenges in expressing event-driven logic and temporal considerations using traditional request-driven approaches.
This document proposes using event processing as a key to immortality. It discusses recent medical research using sensors and actuators to monitor and treat diseases at the cellular level. The goal is to create an "ultimate automatic physician" that can detect any sickness, react proactively or reactively, and make treatment decisions using event processing networks as its "brain". The challenge presented is using such techniques to significantly extend the human lifespan.
The presentation introduced a basic proactive model for taking timely actions to optimize outcomes given anticipated unplanned events. It uses event patterns to predict events and their timing under uncertainty, then determines the optimal action over time by considering action costs and impacts. Several application scenarios were discussed. While the model demonstrates feasibility, further work is needed to address challenges like real-time optimization for other cases, improved forecasting models, usability, and scalability.
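The cost/impact trade-off at the heart of that proactive model can be sketched as a one-line expected-value rule. The probabilities, costs, and mitigation factor below are invented for illustration and are not taken from the presentation.

```python
# Proactive decision sketch: act before a forecasted unplanned event
# only when the expected damage avoided outweighs the cost of acting.

def should_act_proactively(p_event, damage_if_event, action_cost, mitigation):
    """Return True when acting now has positive expected value.

    p_event         : forecast probability the unplanned event occurs
    damage_if_event : cost incurred if it occurs and nothing was done
    action_cost     : cost of the proactive action itself
    mitigation      : fraction of the damage the action prevents (0..1)
    """
    expected_avoided = p_event * damage_if_event * mitigation
    return expected_avoided > action_cost

# 30% chance of a failure costing 1000; acting costs 100 and prevents
# 80% of the damage: expected avoided damage is 240, so acting pays off.
print(should_act_proactively(0.3, 1000, 100, 0.8))  # True
```

A fuller model would optimize the action's timing as well, since both the forecast probability and the action's impact change as the predicted event draws nearer.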
Proactive event-driven computing is a new paradigm that aims to take proactive actions based on forecasted events before problems occur. It involves detecting events, forecasting future events, making real-time decisions, and taking proactive actions. The document discusses challenges in creating proactive solutions and outlines key aspects of proactive computing including uncertainty handling, real-time decision making under uncertainty, and learning patterns and causal relationships.
Event processing systems have evolved from early reactive systems to now support predictive capabilities using techniques like machine learning. Key challenges for these systems include handling uncertain and incomplete event data, performing predictive event processing through graphical model techniques, leveraging machine learning to automate pattern discovery and modeling, and evolving from purely reactive systems to proactively take actions based on predictions. Addressing these challenges will help make event processing systems more intelligent and adaptive.
Develop Secure Enterprise Solutions with iOS Mobile App Development Services (Damco Solutions)
The security of enterprise apps should not be overlooked by organizations. Since these apps handle confidential financial and user data and business operations, ensuring strong security is crucial. That's why businesses should hire dedicated iOS mobile application development service providers to create highly secure enterprise apps. By incorporating sophisticated security mechanisms, these developers make enterprise apps resistant to a range of cyber threats.
Content source - https://www.bizbangboom.com/articles/enterprise-mobile-app-development-with-ios-augmenting-business-security
Read more - https://www.damcogroup.com/ios-application-development-services
Redefining Cybersecurity with AI Capabilities (Priyanka Aash)
In this comprehensive overview of Cisco's latest innovations in cybersecurity, the focus is squarely on resilience and adaptation in the face of evolving threats. The discussion covers the imperative of tackling mal-information, the increasing sophistication of insider attacks, and the expanding attack surfaces in a hybrid work environment. Emphasizing a shift towards integrated platforms over fragmented tools, Cisco introduces its Security Cloud, designed to provide end-to-end visibility and robust protection across user interactions, cloud environments, and breaches. AI emerges as a pivotal tool, from enhancing user experiences to predicting and defending against cyber threats. The blog underscores Cisco's commitment to simplifying security stacks while ensuring efficacy and economic feasibility, making a compelling case for their platform approach in safeguarding digital landscapes.
LeadMagnet IQ Review: Unlock the Secret to Effortless Traffic and Leads.pdf (SelfMade bd)
Imagine being able to generate high-quality traffic and leads effortlessly. Sounds like a dream, right? Well, it’s not. It’s called LeadMagnet IQ, and it’s here to revolutionize your marketing efforts.
Keynote: Presentation on SASE Technology (Priyanka Aash)
Secure Access Service Edge (SASE) solutions are revolutionizing enterprise networks by integrating SD-WAN with comprehensive security services. Traditionally, enterprises managed multiple point solutions for network and security needs, leading to complexity and resource-intensive operations. SASE, as defined by Gartner, consolidates these functions into a unified cloud-based service, offering SD-WAN capabilities alongside advanced security features like secure web gateways, CASB, and remote browser isolation. This convergence not only simplifies management but also enhances security posture and application performance across global networks and cloud environments. Discover how adopting SASE can streamline operations and fortify your enterprise's digital transformation strategy.
COVID-19 and the Level of Cloud Computing Adoption: A Study of Sri Lankan Inf... (AimanAthambawa1)
The study's main objective is to analyse the level of cloud computing adoption and usage during COVID-19 in Sri Lanka, especially in Information Technology (IT) organisations. Drawing on senior IT employees, this study investigates to what extent their organisations have adopted cloud computing, the level of cloud computing usage, the cloud service models currently in use, the cloud deployment models used, preferred cloud service providers, and reasons for adopting or not adopting cloud computing. The study also describes why cloud computing is a solution for new-normal situations and the cloud-enabled services used during and after the COVID-19 pandemic. The findings suggest that 87.7% of the organisations currently use cloud-enabled services, whereas 12.3% do not but intend to adopt them. Considering the benefits, cloud computing is the solution for running the business going forward after the COVID-19 pandemic.
Demystifying Neural Networks And Building Cybersecurity Applications (Priyanka Aash)
In today's rapidly evolving technological landscape, Artificial Neural Networks (ANNs) have emerged as a cornerstone of artificial intelligence, revolutionizing various fields including cybersecurity. Inspired by the intricacies of the human brain, ANNs have a rich history and a complex structure that enables them to learn and make decisions. This blog aims to unravel the mysteries of neural networks, explore their mathematical foundations, and demonstrate their practical applications, particularly in building robust malware detection systems using Convolutional Neural Networks (CNNs).
Finetuning GenAI For Hacking and Defending (Priyanka Aash)
Generative AI, particularly through the lens of large language models (LLMs), represents a transformative leap in artificial intelligence. With advancements that have fundamentally altered our approach to AI, understanding and leveraging these technologies is crucial for innovators and practitioners alike. This comprehensive exploration delves into the intricacies of GenAI, from its foundational principles and historical evolution to its practical applications in security and beyond.
BLOCKCHAIN TECHNOLOGY - Advantages and Disadvantages (SAI KAILASH R)
Explore the advantages and disadvantages of blockchain technology in this comprehensive SlideShare presentation. Blockchain, the backbone of cryptocurrencies like Bitcoin, is revolutionizing various industries by offering enhanced security, transparency, and efficiency. However, it also comes with challenges such as scalability issues and energy consumption. This presentation provides an in-depth analysis of the key benefits and drawbacks of blockchain, helping you understand its potential impact on the future of technology and business.
How UiPath Discovery Suite supports identification of Agentic Process Automat... (DianaGray10)
📚 Understand the basics of the new persona-based, LLM-powered Agentic Process Automation (APA) and discover how existing UiPath Discovery Suite products like Communication Mining, Process Mining, and Task Mining can be leveraged to identify APA candidates.
Topics Covered:
💡 Idea Behind APA: Explore the innovative concept of Agentic Process Automation and its significance in modern workflows.
🔄 How APA is Different from RPA: Learn the key differences between Agentic Process Automation and Robotic Process Automation.
🚀 Discover the Advantages of APA: Uncover the unique benefits of implementing APA in your organization.
🔍 Identifying APA Candidates with UiPath Discovery Products: See how UiPath's Communication Mining, Process Mining, and Task Mining tools can help pinpoint potential APA candidates.
🔮 Discussion on Expected Future Impacts: Engage in a discussion on the potential future impacts of APA on various industries and business processes.
Enhance your knowledge on the forefront of automation technology and stay ahead with Agentic Process Automation. 🧠💼✨
Speakers:
Arun Kumar Asokan, Delivery Director (US) @ qBotica and UiPath MVP
Naveen Chatlapalli, Solution Architect @ Ashling Partners and UiPath MVP
The History of Embeddings & Multimodal EmbeddingsZilliz
Frank Liu will walk through the history of embeddings and how we got to the cool embedding models used today. He'll end with a demo on how multimodal RAG is used.
Top 12 AI Technology Trends For 2024.pdfMarrie Morris
Technology has become an irreplaceable component of our daily lives. The role of AI in technology revolutionizes our lives for the betterment of the future. In this article, we will learn about the top 12 AI technology trends for 2024.
Improving Learning Content Efficiency with Reusable Learning ContentEnterprise Knowledge
Enterprise Knowledge’s Emily Crockett, Content Engineering Consultant, presented “Improve Learning Content Efficiency with Reusable Learning Content” at the Learning Ideas conference on June 13th, 2024.
This presentation explored the basics of reusable learning content, including the types of reuse and the key benefits of reuse such as improved content maintenance efficiency, reduced organizational risk, and scalable differentiated instruction & personalization. After this primer on reuse, Crockett laid out the basic steps to start building reusable learning content alongside a real-life example and the technology stack needed to support dynamic content. Key objectives included:
- Be able to explain the difference between reusable learning content and duplicate content
- Explore how a well-designed learning content model can reduce duplicate content and improve your team’s efficiency
- Identify key tasks and steps in creating a learning content model
Intel Unveils Core Ultra 200V Lunar chip .pdfTech Guru
Intel has made a significant breakthrough in the world of processors with the introduction of its Core Ultra 200V mobile processor series, codenamed Lunar Lake. This innovative processor marks a fundamental shift in the way Intel creates processors, with a high degree of aggregation, including memory-on-package (MoP). The Core Ultra 300 MX series is designed to power thin-and-light devices that are capable of handling the latest AI applications, including Microsoft's Copilot+ experiences.
1. Event processing under uncertainty
Tutorial
Alexander Artikis [1], Opher Etzion [2], Zohar Feldman [2] & Fabiana Fournier [2][3]
DEBS 2012
July 16, 2012
[1] NCSR Demokritos, Greece
[2] IBM Haifa Research Lab, Israel
[3] We acknowledge the help of Jason Filippou and Anastasios Skarlatidis
3. Outline
Part I: Introduction and illustrative example (30 minutes, Opher Etzion)
Part II: Uncertainty representation (15 minutes, Opher Etzion)
Part III: Extending event processing to handle uncertainty (45 minutes, Opher Etzion)
Break
Part IV: AI-based models for event recognition under uncertainty (80 minutes, Alexander Artikis)
Part V: Open issues and summary (10 minutes, Opher Etzion)
4. Event processing under uncertainty tutorial
Part I:
Introduction and illustrative example
5. By 2015, 80% of all available data will be uncertain
[Chart: global data volume in exabytes vs. aggregate uncertainty (%), 2005-2015. Multiple sources: IDC, Cisco]
By 2015 the number of networked devices will be double the entire global population. All sensor data has uncertainty.
The total number of social media accounts exceeds the entire global population. This data is highly uncertain in both its expression and content.
Data quality solutions exist for enterprise data like customer, product, and address data, but this is only a fraction of the total enterprise data.
6. The big data perspective
Big data is one of the three technology trends at the leading edge a CEO cannot afford to overlook in 2012 [Gartner #228708, January 2012]
Dimensions of big data:
Velocity (data in motion)
Variety (data in many forms)
Veracity (data in doubt)
Volume (data processed in RT)
7. The big data perspective in focus
Big data is one of the three technology trends at the leading edge a CEO cannot afford to overlook in 2012 [Gartner #228708, January 2012]
Dimensions of big data:
Velocity (data processed in RT)
Variety (data in many forms)
Veracity (data in doubt)
Volume (data in motion)
8. Uncertainty in event processing
State-of-the-art systems assume that data satisfies the "closed world assumption": it is complete and precise as a result of a cleansing process applied before the data is utilized. Processing the data is deterministic.
The problem:
In real applications, events may be uncertain or have imprecise content for various reasons (missing data; inaccurate or noisy input, e.g. data from sensors or social media).
Often, in real-time applications, cleansing the data is not feasible due to time constraints.
Online data is not leveraged for immediate operational decisions.
Mishandling of uncertainty may result in undesirable outcomes.
This tutorial presents common types of uncertainty and different ways of handling uncertainty in event processing.
9. Representative sources of uncertainty
Uncertain input data/events may stem from:
Source inaccuracy (e.g. a thermometer)
Source malfunction
Human error
Rumor (e.g. in visual data); a fake tweet; a wrong trend
A malicious sensor disrupter
Sampling or approximation
Propagation of uncertainty may stem from:
Inference based on an uncertain value (e.g. a wrong hourly sales summary)
Projection of temporal anomalies
10. Event processing
Build time: an authoring tool is used to define rules/patterns and event definitions.
Run time: event sources feed a runtime engine, which performs situation detection; detected situations trigger actions.
11. Types of uncertainty in event processing
Uncertainty may lie in the event input, in the composite event pattern, or in both:
Imprecise event patterns (at authoring time)
Incomplete event streams
Insufficient event dictionary
Erroneous event recognition
Inconsistent event annotation
12. Illustrative example – Surveillance system for crime detection
A visual surveillance system provides data to an event-driven application. The goal of the event processing application is to detect, and alert on in real time, possible occurrences of crimes (e.g. a drug deal), based on the surveillance system and citizens' reports.
Real-time alerts are posted to human security officers:
1. Whenever the same person is observed participating in a potential crime scene more than 5 times in a certain day
2. Identification of locations in which more than 10 suspicious detections were observed in three consecutive days
3. Report of the three most suspicious locations in the last month
13. Some terminology
Event producer/consumer
Event type
Event stream
Event Processing Agent (EPA)
Event Channel (EC)
Event Processing Network (EPN)
Context (temporal, spatial and segmentation)
14. Illustrative example – Event Processing Network (EPN)
[Diagram: an EPN composed of producers/consumers, channels, and Event Processing Agents (EPAs)]
Citizens (producer) emit Crime report events; the Crime report matching EPA emits Matched crime events, which alert the police.
The surveillance video (producer) emits Observation events; the Crime observation EPA emits Suspicious observation events.
The Mega suspect detection EPA emits Mega suspect events; the Split mega suspect EPA, consulting the Criminals data store, emits Known criminal events (alerting special security forces) or Discovered criminal events (routed to special security forces).
The Suspicious location detection EPA emits Suspicious location events, which alert patrol forces.
The Most frequent locations EPA emits a Top locations report to the statistical police dept.
15. Crime observation
Uncertainty in this EPA: wrong identification, missing events, noisy pictures.
The Crime observation EPA filters Observation events into Suspicious observation events (attributes include Crime Indication and Identity):
Assertion: Crime indication = 'true'
Context: Always
16. Mega Suspect detection
Uncertainty in this EPA: wrong identification, missing events, wrong threshold.
The Mega suspect detection EPA applies a threshold pattern to Suspicious observation events (attributes include Crime Indication and Identity) and emits Mega suspect events:
Assertion: Count (Suspicious observation) > 5
Context: Daily per suspect
17. Split Mega Suspect
Uncertainty in this EPA: wrong identification, missing events, wrong threshold.
The Split mega suspect EPA splits Mega suspect events (attributes include Picture), consulting the Criminals data store:
Emit Known criminal when exists (criminal-record)
Emit Discovered criminal otherwise
Context: Always
18. Crime report matching
Uncertainty in this EPA: missed events, wrong sequence, inaccurate location, inaccurate crime type.
The Crime report matching EPA applies a sequence pattern to Suspicious observation and Crime report events (attributes include Crime Indication and Identity) and emits Matched crime events:
Pattern: Sequence (Suspicious observation [override], Crime report)
Context: Crime type per location
22. Occurrence uncertainty: uncertainty whether an event has occurred
The event header has an attribute labeled "certainty" with values in the range [0,1].
Example:
Event type: Crime observation
Occurrence time: 7/7/12, 12:23
Certainty: 0.73
In raw events, the certainty is given by the source (e.g. sensor accuracy, level of confidence); in derived events, it is calculated as part of the event derivation.
23. Temporal uncertainty: uncertainty when an event has occurred
The event header has an attribute labeled "occurrence time", which may designate a timestamp, an interval, or a distribution over an interval.
Example:
Event type: Crime observation
Occurrence time: U(7/7/12 12:00, 7/7/12 13:00)
Certainty: 0.9
In raw events, the occurrence time is given by the source; in derived events, it is calculated as part of the event derivation.
24. The interval semantics
1. The event happens during the entire interval (occurrence time type: interval)
2. The event happens at some unknown time-point within the interval (occurrence time type: timestamp; instance type: interval)
3. The event happens at some time-point within an interval and the distribution of probabilities is known (occurrence time type: timestamp; instance type: interval + distribution). Example: U(7/7/12 12:00, 7/7/12 13:00)
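These three variants can be carried in the event header itself. A minimal Python sketch of such a header, with illustrative class and field names (not taken from any concrete event processing language):

```python
from dataclasses import dataclass
from typing import Tuple, Union
import random

# Occurrence time may be a timestamp, an interval, or a uniform
# distribution over an interval (interval semantics 1-3 above).
Timestamp = float
Interval = Tuple[float, float]

@dataclass
class UniformTime:
    start: float
    end: float
    def sample(self) -> float:
        return random.uniform(self.start, self.end)

@dataclass
class Event:
    event_type: str
    occurrence_time: Union[Timestamp, Interval, UniformTime]
    certainty: float  # header attribute in [0, 1]

# The slide's example: crime observation, U(12:00, 13:00), certainty 0.9.
obs = Event("Crime observation", UniformTime(12.0, 13.0), 0.9)
```

A derivation step can then branch on the occurrence-time type to decide which interval semantics applies.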
25. Spatial uncertainty: uncertainty where an event has occurred
The event header has an attribute labeled "location", which may designate a point, a poly-line, an area, or a distribution of points within a poly-line or area. Locations can be specified either in coordinates or in labels (e.g. addresses).
Example:
Event type: Crime observation
Occurrence time: U(7/7/12 12:00, 7/7/12 13:00)
Certainty: 0.9
Location: near 30 Market Street, Philadelphia
In raw events, the location is given by the source; in derived events, it is calculated as part of the event derivation.
26. The spatial semantics
A poly-line may designate a road, e.g. the M2 in the UK. An area may designate a bounded geographical area, e.g. the city of Princeton.
Like a temporal interval, the designation of a poly-line or area may be interpreted as: the event happens in the entire area; the event happens at an unknown point within the area; or the event happens within the area with some distribution of points.
Examples:
Location type: area, Location := Princeton
Location type: point, Location := M2
Location type: point, Location := U(center = city-hall Princeton, radius = 2KM)
27. Uncertainty about the event content – attribute values (meta-data)
Meta-data definition: attribute values may be probabilistic.
28. Uncertainty about the event content – attribute value example
Instance: "Crime indication" has an explicit distribution.
29. Uncertainty in situation detection
Recall the "Mega suspect detection" threshold pattern:
Assertion: Count (Suspicious observation) > 5
Context: Daily per suspect
The "Mega suspect detection" can have some probability; furthermore, there can be some variations (due to possibly missing events) when count = 5, count = 4, etc. Thus it may become a collection of assertions with probabilities:
Assertion: Count (Suspicious observation)
> 5 with probability = 0.9
= 5 with probability = 0.45
= 4 with probability = 0.2
Context: Daily per suspect
30. Event processing under uncertainty tutorial
Part III:
Extending event processing to handle
uncertainty
31. Uncertainty handling
Traditional event processing needs to be enhanced to account
for uncertain events
Two main handling methods:
Uncertainty propagation
The uncertainty of input events is propagated to the
derived events
Uncertainty flattening
Uncertain values are replaced with
deterministic equivalents; events may be
ignored.
32. Topics covered
1. Expressions*
2. Filtering
3. Split
4. Context assignment
5. Pattern matching examples:
a. Top K
b. Split
* Note that transformations and aggregations are covered by expressions.
33. Expressions (1/4)
Expressions are used extensively in event-processing in
Derivation: assigning values to derived events
Assertions: filter and pattern detection
Expressions comprise a combination of
Operators
Arithmetic functions, e.g. sum, min
Logical operators, e.g. >, =
Textual operators, e.g. concatenation
Operands (terms) referencing
particular event attributes
general constants
34. Expressions (2/4)
Example: summing the number of suspicious observations in two locations
location1.num-observations + location2.num-observations
Deterministic case: 12 + 6 = 18
Stochastic case: a distribution over {11, 12, 13} + a distribution over {4, 5, 6, 7} = a distribution over {15, ..., 20}
35. Expressions (3/4)
The 'uncertainty propagation' approach: operator overloading
Explicit, e.g. Normal(1,1) + Normal(3,1) = Normal(4,2)
Numerical convolution (general distributions, probabilistic dependencies)
Monte-Carlo methods: generate samples, evaluate the result, and aggregate
The 'uncertainty flattening' approach: replacing the original expression using a predefined policy. Some examples:
Expectation: X+Y → Expectation(X+Y)
Confidence value: X+Y → Percentile(X+Y, 0.9)
X>5 → 'true' if Prob{X>5} > 0.9; 'false' otherwise
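For discrete distributions, numerical convolution and the flattening policies are each a few lines. A sketch, assuming for illustration uniform weights over the supports of the slide-34 example ({11, 12, 13} and {4, 5, 6, 7}):

```python
import random

def convolve(px, py):
    """Distribution of X + Y for independent discrete X, Y,
    each given as {value: probability} (numerical convolution)."""
    out = {}
    for x, p in px.items():
        for y, q in py.items():
            out[x + y] = out.get(x + y, 0.0) + p * q
    return out

def expectation(px):
    return sum(v * p for v, p in px.items())

loc1 = {11: 1/3, 12: 1/3, 13: 1/3}
loc2 = {v: 0.25 for v in (4, 5, 6, 7)}

# Uncertainty propagation: the derived value keeps a full distribution.
total = convolve(loc1, loc2)

# Uncertainty flattening: replace X + Y with Expectation(X + Y).
flat = expectation(loc1) + expectation(loc2)  # 12 + 5.5 = 17.5

# Monte-Carlo alternative: generate samples, evaluate, aggregate.
samples = [random.choices(list(loc1), weights=list(loc1.values()))[0] +
           random.choices(list(loc2), weights=list(loc2.values()))[0]
           for _ in range(10_000)]
```

Overloading `+` on a distribution class would make such expressions look exactly like their deterministic counterparts.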
36. Expressions (4/4)
Probabilistic operators:
expectation(X): X is a stochastic term; the expectation of X
variance(X): X is a stochastic term; the variance of X
pdf(X, x): X is a stochastic term, x a value; the probability of X taking the value x
cdf(X, x): X is a stochastic term, x a numeric value; the probability that X takes a value not larger than x
percentile(X, α): X is a stochastic term, α a numeric value; the smallest number z such that cdf(X, z) ≥ α
frequent(X): X is a stochastic term; a value x that maximizes pdf(X, ·)
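For a discrete stochastic term represented as {value: probability} (the representation is an assumption, not from the tutorial), these operators are direct to implement:

```python
def expectation(X):
    return sum(v * p for v, p in X.items())

def variance(X):
    m = expectation(X)
    return sum(p * (v - m) ** 2 for v, p in X.items())

def pdf(X, x):
    return X.get(x, 0.0)

def cdf(X, x):
    return sum(p for v, p in X.items() if v <= x)

def percentile(X, alpha):
    # Smallest z in the support such that cdf(X, z) >= alpha.
    acc = 0.0
    for v in sorted(X):
        acc += X[v]
        if acc >= alpha:
            return v

def frequent(X):
    # A value that maximizes pdf(X, .).
    return max(X, key=X.get)

X = {4: 0.2, 5: 0.5, 6: 0.3}
```

These are exactly the deterministic functionals the flattening policies plug in for stochastic sub-expressions.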
37. Filtering (1/3)
Should the 'Observation' event instance pass the filter?
Observation:
Certainty: 0.8
Crime-Indication: ['true', 0.7; 'false', 0.3]
Crime observation EPA:
Assertion: crime-indication = 'true'
Context: Always
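A sketch of how the two handling methods introduced earlier could resolve this filter, assuming the event-level certainty and the attribute distribution are independent (the numbers are those of the example above):

```python
certainty = 0.8
crime_indication = {"true": 0.7, "false": 0.3}

# Uncertainty propagation: pass the event, folding the probability
# that the assertion holds into the derived certainty.
propagated_certainty = certainty * crime_indication["true"]  # 0.8 * 0.7

# Uncertainty flattening: replace the distribution by its most
# frequent value and evaluate the assertion deterministically.
most_frequent = max(crime_indication, key=crime_indication.get)
passes = (most_frequent == "true")
```

Under propagation the 'Suspicious observation' would carry certainty 0.56; under flattening it simply passes, since 'true' is the most frequent value.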
40. Split (1/3)
Mega suspect:
Certainty: 0.75
Id: ['John Doe1', 0.3; NA, 0.7]
Split mega-suspect EPA (Pattern: Split; Context: Always) with two branches:
Known criminal (branch condition: Id is not null):
Certainty: 0.75
Id: ['John Doe1', 0.3; NA, 0.7]
Discovered criminal (branch condition: Id is null):
Certainty: 0.75
Id: ['John Doe1', 0.3; NA, 0.7]
41. Split (2/3)
The 'uncertainty propagation' approach
Mega suspect:
Certainty: 0.75
Id: ['John Doe1', 0.3; NA, 0.7]
Split mega-suspect EPA (Pattern: Split; Context: Always) produces:
Known criminal:
Certainty: 0.75*0.3
Id: 'John Doe1'
Discovered criminal:
Certainty: 0.75*0.7
Id: NA
42. Split (3/3)
The 'uncertainty flattening' approach
Id → frequent(Id)
Mega suspect:
Certainty: 0.75
Id: ['John Doe1', 0.3; NA, 0.7]
Split mega-suspect EPA (Pattern: Split; Context: Always) produces only:
Discovered criminal:
Certainty: 0.75
Id: NA
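The two split policies above can be sketched as follows (the dict-based event representation is illustrative; None stands for NA):

```python
mega_suspect = {"certainty": 0.75,
                "id": {"John Doe1": 0.3, None: 0.7}}

# Uncertainty propagation: one derived event per possible Id value,
# each with certainty = parent certainty * branch probability.
def split_propagation(event):
    out = []
    for value, prob in event["id"].items():
        kind = "Known criminal" if value is not None else "Discovered criminal"
        out.append({"type": kind,
                    "certainty": event["certainty"] * prob,
                    "id": value})
    return out

# Uncertainty flattening: keep only the most frequent Id value.
def split_flattening(event):
    value = max(event["id"], key=event["id"].get)
    kind = "Known criminal" if value is not None else "Discovered criminal"
    return {"type": kind, "certainty": event["certainty"], "id": value}

derived = split_propagation(mega_suspect)
flat = split_flattening(mega_suspect)
```

Propagation yields both a Known criminal (certainty 0.225) and a Discovered criminal (certainty 0.525); flattening yields only the Discovered criminal, at the original certainty 0.75.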
43. Context assignment – temporal: Example 1 (1/3)
Should the 'Suspicious observation' event be assigned to the context segment 'July 16, 2012, 10AM-11AM'?
Suspicious Observation:
Certainty: 0.8
Occurrence time: Uni(9:45AM, 10:05AM)
Id: 'John Doe'
Mega suspect detection
Context segment: 'John Doe', 'July 16, 2012, 10AM-11AM'
44. Context assignment – temporal: Example 1 (2/3)
The 'uncertainty propagation' approach
Assigned certainty = Prob{occurrence time ∊ [10AM,11AM]} * certainty
Suspicious Observation, as assigned to the segment:
Certainty: 0.2
Occurrence time: Uni(9:45AM, 10:05AM)
Id: 'John Doe'
Mega suspect detection
Context segment: 'John Doe', 'July 16, 2012, 10AM-11AM'
45. Context assignment – temporal: Example 1 (3/3)
The 'uncertainty flattening' approach
Assign the event only if Prob{occurrence time ∊ [10AM,11AM]} > 0.5
Suspicious Observation:
Certainty: 0.2
Occurrence time: Uni(9:45AM, 10:05AM)
Id: 'John Doe'
Mega suspect detection
Context segment: 'John Doe', 'July 16, 2012, 10AM-11AM'
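The numbers in the temporal example above can be reproduced directly: for Uni(9:45AM, 10:05AM), the probability of falling in [10AM, 11AM] is the overlap (5 minutes) over the interval length (20 minutes), i.e. 0.25. A sketch, with times expressed in minutes since midnight:

```python
def prob_in_segment(start, end, seg_start, seg_end):
    """P(occurrence time in [seg_start, seg_end]) for a uniform
    occurrence-time distribution on [start, end]."""
    overlap = max(0.0, min(end, seg_end) - max(start, seg_start))
    return overlap / (end - start)

certainty = 0.8
p = prob_in_segment(9*60 + 45, 10*60 + 5, 10*60, 11*60)  # 0.25

# Propagation: the event joins the segment with reduced certainty.
propagated = p * certainty  # 0.2, as on the slide

# Flattening: the event joins only if p > 0.5; here it does not.
assigned = p > 0.5
```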
46. Context assignment – temporal: Example 2
Should the uncertain 'Suspicious observation' event initiate a new context segment, or participate in an existing context segment?
Suspicious Observation:
Certainty: 0.3
Occurrence time: Uni(9:45AM, 10:05AM)
Id: 'John Doe'
Options: an existing 'Suspicious observation detection' context segment for 'John Doe' (state: 1 instance), or a new 'Suspicious observation detection' context segment for 'John Doe'.
47. Context assignment – Segmentation-oriented (1/3)
Which of the context segments should the 'Suspicious observation' event be assigned to?
Suspicious Observation:
Certainty: 0.8
Occurrence time: Uni(9:45AM, 10:05AM)
Id: ['John Doe1', 0.3; 'John Doe2', 0.7]
Mega suspect detection, context segment: 'John Doe1', 'July 16, 2012, 10AM-11AM'
Mega suspect detection, context segment: 'John Doe2', 'July 16, 2012, 10AM-11AM'
48. Context assignment – Segmentation-oriented (2/3)
The 'uncertainty propagation' approach: the event is assigned to both segments.
To segment 'John Doe1', 'July 16, 2012, 10AM-11AM', with certainty observation.certainty * Prob{'John Doe1'}:
Suspicious Observation: Certainty 0.24, Occurrence time Uni(9:45AM, 10:05AM), Id ['John Doe1', 0.3; 'John Doe2', 0.7]
To segment 'John Doe2', 'July 16, 2012, 10AM-11AM', with certainty observation.certainty * Prob{'John Doe2'}:
Suspicious Observation: Certainty 0.56, Occurrence time Uni(9:45AM, 10:05AM), Id ['John Doe1', 0.3; 'John Doe2', 0.7]
49. Context assignment – Segmentation-oriented (3/3)
The 'uncertainty flattening' approach: associate the event to the context segment of frequent(Id).
Suspicious Observation: Certainty 0.8, Occurrence time Uni(9:45AM, 10:05AM), Id ['John Doe1', 0.3; 'John Doe2', 0.7]
frequent(Id) = 'John Doe2', so the event is assigned only to the Mega suspect detection segment 'John Doe2', 'July 16, 2012, 10AM-11AM'.
50. Pattern matching: Top-K (1/3)
(aka event recognition)
What is the location with the most crime indications?
Suspicious location events for 'downtown': Certainty 0.3; Count is a distribution over {11, 12, 13}
Suspicious location events for 'suburbs': Certainty 0.75; Count is a distribution over {10, 11, 12, 13, 14}
Most frequent locations EPA:
Pattern: Top-K (K=1)
Order by: count (suspicious location)
Context: Month
What should the resulting order be?
51. Pattern matching: Top-K (2/3)
The 'uncertainty propagation' approach
Most frequent locations EPA: Pattern: Top-K (K=1); Order by: count (Suspicious location); Context: Month
Let count1 be the stochastic count for 'downtown' (distribution over {11, 12, 13}) and count2 the stochastic count for 'suburbs' (distribution over {10, 11, 12, 13, 14}). The output order is itself stochastic:
Order = ('downtown', 'suburbs') w.p. Prob{count1 > count2}
Order = ('suburbs', 'downtown') w.p. Prob{count2 > count1}
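Prob{count1 > count2} for independent discrete counts is a double sum over the joint support; a sketch, assuming for illustration uniform weights over the supports shown above:

```python
def prob_greater(pa, pb):
    """P(A > B) for independent discrete A, B given as {value: prob}."""
    return sum(p * q for a, p in pa.items()
                     for b, q in pb.items() if a > b)

count1 = {11: 1/3, 12: 1/3, 13: 1/3}            # 'downtown'
count2 = {v: 0.2 for v in (10, 11, 12, 13, 14)}  # 'suburbs'

p_downtown_first = prob_greater(count1, count2)
p_suburbs_first = prob_greater(count2, count1)
p_tie = 1 - p_downtown_first - p_suburbs_first
```

Note that with discrete counts some probability mass can sit on ties, which the ordering policy must also resolve.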
52. Pattern matching: Top-K (3/3)
The 'uncertainty flattening' approach
Most frequent locations EPA: Pattern: Top-K (K=1); Order by: f(count(suspicious location)), where f is a deterministic functional of the distribution (e.g. expectation, cdf, ...); Context: Month
Input as before: 'downtown' with a count distribution over {11, 12, 13} and certainty 0.3; 'suburbs' with a count distribution over {10, 11, 12, 13, 14} and certainty 0.75.
Output: Top suspicious location list
Certainty: 0.3
Id list: 'downtown'
53. Pattern matching: Sequence (1/3)
Should the following event instances be matched as a sequence?
Suspicious Observation:
Certainty: 0.8
Occurrence time: Uni(9:45AM, 10:05AM)
Id: 'John Doe'
Crime report:
Certainty: 0.9
Occurrence time: 10:02AM
Id: NA
Crime report matching EPA:
Pattern: Sequence [Suspicious observation, Crime report]
Context: Location, Crime type
54. Pattern matching: Sequence (2/3)
The 'uncertainty propagation' approach
Derived certainty = obs.certainty * crime.certainty * Prob{obs.time < crime.time}
Derived occurrence time = obs.time conditioned on obs.time < crime.time
Input: Suspicious Observation (Certainty 0.8, Occurrence time Uni(9:45AM, 10:05AM), Id 'John Doe') and Crime report (Certainty 0.9, Occurrence time 10:02AM, Id NA)
Output: Matched crime
Certainty: 0.612
Occurrence time: Uni(9:45AM, 10:02AM)
Id: 'John Doe'
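The derived certainty 0.612 follows from the formula above: for Uni(9:45AM, 10:05AM), the probability of preceding 10:02AM is 17/20 = 0.85, and 0.8 * 0.9 * 0.85 = 0.612. A sketch, with times in minutes since midnight:

```python
def prob_before(start, end, t):
    """P(X < t) for X ~ Uniform(start, end)."""
    if t <= start:
        return 0.0
    if t >= end:
        return 1.0
    return (t - start) / (end - start)

obs_certainty, crime_certainty = 0.8, 0.9
p_seq = prob_before(9*60 + 45, 10*60 + 5, 10*60 + 2)        # 0.85
matched_certainty = obs_certainty * crime_certainty * p_seq  # 0.612
```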
55. Pattern matching: Sequence (3/3)
The 'uncertainty flattening' approach
Occurrence time → percentile(occurrence time, 0.5)
Input: Suspicious Observation (Certainty 0.8, Occurrence time Uni(9:45AM, 10:05AM) flattened to 9:55AM, Id 'John Doe') and Crime report (Certainty 0.9, Occurrence time 10:02AM, Id NA)
Since 9:55AM < 10:02AM, the sequence matches deterministically.
Output: Matched crime
Certainty: 0.72
Occurrence time: 9:55AM
Id: 'John Doe'
56. Event processing under uncertainty tutorial
Part IV:
Event Recognition under Uncertainty in
Artificial Intelligence
57. Event Recognition under Uncertainty in Artificial
Intelligence
AI-based systems already deal with various types of uncertainty:
Erroneous input data detection.
Imperfect composite event definitions.
Overview:
Logic programming.
Markov logic networks.
58. Human Recognition from Video: Bilattice-based Reasoning
Aim:
Recognise humans given input data from part-based classifiers
operating on video.
Uncertainty:
Erroneous input data detection.
Imperfect composite event definitions.
Approach:
Extend logic programming by incorporating a bilattice:
Consider both supportive and contradicting information about
a given human hypothesis.
For every human hypothesis, provide a list of justifications
(proofs) that support or contradict the hypothesis.
60. Human Recognition from Video
Evidence FOR hypothesis “Entity1 is a human”:
Head visible.
Torso visible.
Scene is consistent at 1’s position.
61. Human Recognition from Video
Evidence AGAINST hypothesis “Entity1 is a human”:
Legs occluded.
However, the missing legs can be explained because of the
occlusion by the image boundary.
If occluded body parts can be explained by a rule, the
hypothesis is further supported.
62. Human Recognition from Video
Evidence FOR hypothesis “Entity A is a human”:
A is on the ground plane.
Scene is consistent at A’s position.
63. Human Recognition from Video
Evidence AGAINST hypothesis “Entity A is a human”:
No head detected.
No torso detected.
No legs detected.
Evidence against the hypothesis is overwhelming, so the
system infers that A is unlikely to be a human.
64. Human Recognition from Video
Recognising humans given three different sources of
information, supporting or contradicting human hypotheses:
human(X, Y, S) ← head(X, Y, S)
human(X, Y, S) ← torso(X, Y, S)
¬human(X, Y, S) ← ¬scene_consistent(X, Y, S)
Supporting information sources are provided by head and torso detectors.
Contradicting information is provided by the violation of geometric constraints (scene_consistent).
65. Human Recognition from Video:
Uncertainty Representation
We are able to encode information for and against:
Input data (output of part-based detectors)
Rules (typically they are less-than-perfect)
Hypotheses (inferrable atoms)
We do this by defining a truth assignment function ϕ : L → B, where L is a declarative language and B is a bilattice.
Some points in the bilattice:
< 0, 0 > agnostic
< 1, 0 > absolute certainty about truth
< 0, 1 > absolute certainty about falsity
< 1, 1 > complete contradiction
66. Human Recognition from Video
A perfect part-based detector would detect body parts and geometrical constraints with absolute certainty.
head(25, 95, 0.9)
torso(25, 95, 0.9)
¬scene_consistent(25, 95, 0.9)
67. Human Recognition from Video: Input Data Uncertainty
However, in realistic applications, every piece of low-level information is detected with a degree of certainty, usually normalized to a probability.
ϕ(head(25, 95, 0.9)) = <0.90, 0.10>
ϕ(torso(25, 95, 0.9)) = <0.70, 0.30>
ϕ(¬scene_consistent(25, 95, 0.9)) = <0.80, 0.20>
For simple detectors, if the probability of the detection is p, the following holds:
ϕ(detection) = <p, 1 − p>
68. Human Recognition from Video
Traditional logic programming rules are binary:
human(X, Y, S) ← head(X, Y, S)
human(X, Y, S) ← torso(X, Y, S)
¬human(X, Y, S) ← ¬scene_consistent(X, Y, S)
69. Human Recognition from Video: Rule Uncertainty
But in a realistic setting, we'd like to have a measure of their reliability:
ϕ(human(X, Y, S) ← head(X, Y, S)) = <0.40, 0.60>
ϕ(human(X, Y, S) ← torso(X, Y, S)) = <0.30, 0.70>
ϕ(¬human(X, Y, S) ← ¬scene_consistent(X, Y, S)) = <0.90, 0.10>
Rules are specified manually.
There exists an elementary weight learning technique: the value x in <x, y> is the fraction of the times that the head of a rule is true when the body is true.
70. Human Recognition from Video: Uncertainty Handling
It is now possible to compute the belief for and against a given human hypothesis, by aggregating over all the possible rules and input data. This is done by computing the closure over ϕ of multiple sources of information.
E.g. if we have a knowledge base S with only one rule and one fact that together entail human
S = { human(X, Y, S) ← head(X, Y, S), head(25, 95, 0.9) }
the degree of belief for and against the human hypothesis would be:
cl(ϕ)(human(X, Y, S)) = ⋀_{p ∈ S} cl(ϕ)(p) =
cl(ϕ)(human(X, Y, S) ← head(X, Y, S)) ∧ cl(ϕ)(head(25, 95, 0.9)) = ··· = <xh, yh>
71. Human Recognition from Video: Uncertainty Handling
Because we have a variety of information sources from which to entail a hypothesis (or the negation of a hypothesis), the inference procedure needs to take all possible information sources into account. The operator ⊕ is used for this.
The final form of the equation that computes the closure over the truth assignment of a hypothesis q is the following:
cl(ϕ)(q) = [⊕_{S ⊨ q} ⋀_{p ∈ S} cl(ϕ)(p)] ⊕ ¬[⊕_{S ⊨ ¬q} ⋀_{p ∈ S} cl(ϕ)(p)]
The first term aggregates supporting information; the second negates contradicting information.
72. Human Recognition from Video: Uncertainty Handling
Given this uncertain input
ϕ(head(25, 95, 0.9)) = <0.90, 0.10>
ϕ(torso(25, 95, 0.9)) = <0.70, 0.30>
ϕ(¬scene_consistent(25, 95, 0.9)) = <0.80, 0.20>
and these uncertain rules
ϕ(human(X, Y, S) ← head(X, Y, S)) = <0.40, 0.60>
ϕ(human(X, Y, S) ← torso(X, Y, S)) = <0.30, 0.70>
ϕ(¬human(X, Y, S) ← ¬scene_consistent(X, Y, S)) = <0.90, 0.10>
we would like to calculate our degrees of belief for and against the hypothesis human(25, 95, 0.9).
74. Human Recognition from Video: Summary
Uncertainty:
Erroneous input data detection.
Imperfect composite event definitions.
Features:
Consider both supportive and contradicting information about
a given hypothesis.
For every hypothesis, provide a list of justifications (proofs)
that support or contradict the hypothesis.
Note:
Human detection is not a typical event recognition problem.
75. Public Space Surveillance: VidMAP
Aim:
Continuously monitor an area and report suspicious activity.
Uncertainty:
Erroneous input data detection.
Approach:
Multi-threaded layered system combining computer vision and
logic programming.
76. VidMAP: Architecture
High-Level Module
Standard logic programming reasoning
Mid-Level Module
Uncertainty elimination
Low-Level Module
Background Subtraction, Tracking and Appearance Matching on
video content
78. VidMAP: Uncertainty Elimination
Filter out noisy observations (false alarms) by first
determining whether the observation corresponds to people or
objects that have been persistently tracked.
A tracked object is considered to be a human if:
It is ‘tall’.
It has exhibited some movement in the past.
A tracked object is considered to be a ‘package’ if:
It does not move on its own.
At some point in time, it was attached to a human.
79. VidMAP: Event Recognition
The following rule defines theft:
theft(H, Obj, T ) ←
human(H),
package(Obj),
possess(H, Obj, T ),
not belongs(Obj, H, T )
A human possesses an object if he carries it.
An object belongs to a human if he was seen possessing it
before anyone else.
80. VidMAP: Event Recognition
The following rule defines entry violation:
entry_violation(H) ←
human(H),
appear(H, scene, T1),
enter(H, building_door, T2),
not privileged_to_enter(H, T1, T2)
'building_door' and 'scene' correspond to video areas that can be hard-coded by the user at system start-up.
An individual is privileged to enter a building if he swipes his ID card in the 'building_door' area of the 'scene', or if he is escorted into the building by a friend who swipes his ID card.
81. VidMAP: Summary
Uncertainty:
Erroneous input data detection.
Features:
Uncertainty elimination.
Intuitive composite event definitions that are easy to be
understood by domain experts.
Note:
Crude temporal representation.
82. Probabilistic Logic Programming: Event Calculus
Aim:
General-purpose event recognition system.
Uncertainty:
Erroneous input data detection.
Approach:
Express the Event Calculus in ProbLog.
83. The Event Calculus (EC)
A Logic Programming language for representing and reasoning
about events and their effects.
Key Components:
event (typically instantaneous)
fluent: a property that may have different values at different
points in time.
Built-in representation of inertia:
F = V holds at a particular time-point if F = V has been
initiated by an event at some earlier time-point, and not
terminated by another event in the meantime.
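The inertia rule can be sketched as a toy interpreter over initiation and termination time-points for a single fluent-value pair (an illustration only, not the tutorial's Event Calculus implementation):

```python
def holds_at(t, initiations, terminations):
    """F=V holds at t iff it was initiated by an event at some
    earlier time-point and not terminated in the meantime."""
    starts = [i for i in initiations if i < t]
    if not starts:
        return False
    last_init = max(starts)
    return not any(last_init < term < t for term in terminations)

# A fluent initiated at times 3 and 10, terminated at time 7:
inits, terms = [3, 10], [7]
```

So the fluent holds at 5, ceases to hold after the termination at 7, and holds again after the re-initiation at 10.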
84. ProbLog
A Probabilistic Logic Programming language.
Allows for independent ‘probabilistic facts’ prob::fact.
Prob indicates the probability that fact is part of a possible
world.
Rules are written as in classic Prolog.
The probability of a query q imposed on a ProbLog database
(success probability) is computed by the following formula:
Ps(q) = P( ∨_{e ∈ Proofs(q)} ∧_{fi ∈ e} fi )
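For independent probabilistic facts, this success probability can be computed exactly by enumerating possible worlds. A sketch under an assumed toy example (facts a, b, c and proofs {a, b} and {c} are mine, not from the slides):

```python
from itertools import product

probs = {"a": 0.7, "b": 0.6, "c": 0.5}
proofs = [{"a", "b"}, {"c"}]   # q succeeds if all facts of some proof hold

def success_probability(probs, proofs):
    facts = sorted(probs)
    total = 0.0
    for truth in product([True, False], repeat=len(facts)):
        world = dict(zip(facts, truth))
        # probability of this possible world (independent facts)
        p_world = 1.0
        for f in facts:
            p_world *= probs[f] if world[f] else 1 - probs[f]
        # add it if the query is provable in this world
        if any(all(world[f] for f in proof) for proof in proofs):
            total += p_world
    return total

print(success_probability(probs, proofs))
```

Inclusion-exclusion gives the same value: P(a ∧ b) + P(c) − P(a ∧ b ∧ c) = 0.42 + 0.5 − 0.21 = 0.71.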
87. Uncertainty Elimination vs Uncertainty Propagation
Crisp-EC (uncertainty elimination) vs ProbLog-EC (uncertainty
propagation):
Crisp-EC: input data with probability < 0.5 are discarded.
ProbLog-EC: all input data are kept with their probabilities.
ProbLog-EC: accept as recognised the composite events that
have probability > 0.5.
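A small numeric illustration of the difference (an assumed noisy-OR reading of repeated detections, chosen only to make the contrast concrete):

```python
# Three repeated detections of the same event, each below 0.5.
events = [("enter", 0.40), ("enter", 0.45), ("enter", 0.40)]

# Crisp-EC (uncertainty elimination): everything below 0.5 is dropped.
crisp_kept = [e for e in events if e[1] >= 0.5]

# ProbLog-EC (uncertainty propagation): all inputs are kept, and the
# probability that at least one detection is real accumulates.
p_none = 1.0
for _, p in events:
    p_none *= 1 - p
p_any = 1 - p_none

print(crisp_kept, round(p_any, 3))
```

Crisp-EC keeps nothing, while the propagated probability exceeds the 0.5 recognition threshold.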
93. Uncertainty Elimination vs Uncertainty Propagation
ProbLog-EC clearly outperforms Crisp-EC when:
The environment is highly noisy (input data probability < 0.5)
— realistic assumption in many domains,
there are successive initiations that allow the composite
event’s probability to increase and eventually exceed the
specified (0.5) threshold, and
the number of probabilistic conjuncts in an initiation condition
is limited.
94. ProbLog-EC: Summary
Uncertainty:
Erroneous input data detection.
Features:
Built-in rules for complex temporal representation.
Note:
Independence assumption on input data is not always
desirable.
95. Probabilistic Graphical Models
E.g. Hidden Markov Models, Dynamic Bayesian Networks and
Conditional Random Fields.
Can naturally handle uncertainty:
Erroneous input data detection.
Imperfect composite event definitions.
However: limited representation capabilities.
Composite events with complex relations create models with
prohibitively large and complex structure.
Various extensions to reduce the complexity.
Lack of a formal representation language.
Difficult to incorporate background knowledge.
96. Markov Logic Networks (MLN)
Unify first-order logic with graphical models.
Compactly represent event relations.
Handle uncertainty.
Syntactically: weighted first-order logic formulas (Fi , wi ).
Semantically: (Fi , wi ) represents a probability distribution
over possible worlds.
P(world) ∝ exp( Σ (weights of formulas it satisfies) )
A world violating formulas becomes less probable, but not
impossible!
97. Event Recognition using MLN
[Figure: event recognition pipeline — the input evidence (SDEs) and the knowledge base of composite event (CE) definitions are grounded into a Markov network; probabilistic inference over that network yields P(CE = True | SDE), i.e. the recognised CEs.]
99. Markov Logic: Representation
Weight: is a real-valued number.
Higher weight −→ Stronger constraint.
Hard constraints:
Must be satisfied in all possible worlds.
Infinite weight values.
Background knowledge.
Soft constraints:
May not be satisfied in some possible worlds.
Strong weight values: almost always true.
Weak weight values: describe exceptions.
100. Markov Logic: Input Data Uncertainty
Sensors detect events with:
Absolute certainty
A degree of confidence
Example:
holds_Dangerous_Object(Id) detected with probability 0.8.
It can be represented with an auxiliary formula.
The weight of the formula is the log-odds of the probability:
ln(0.8 / 0.2) ≈ 1.38629.
1.38629 obs_holds_Dangerous_Object(Id) ⇒ holds_Dangerous_Object(Id)
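The log-odds weight from the example can be checked directly:

```python
import math

def log_odds(p):
    """Weight of the auxiliary formula: w = ln(p / (1 - p))."""
    return math.log(p / (1 - p))

print(round(log_odds(0.8), 5))  # the 1.38629 weight from the slide
```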
101. Markov Logic: Network Construction
Formulas are translated into clausal form.
Weights are divided equally among the clauses of a formula.
Given a set of constants from the input data, ground all
clauses.
Ground predicates are Boolean nodes in the network.
Each ground clause:
Forms a clique in the network,
and is associated with wi and a Boolean feature.
P(X = x) = (1/Z) exp( Σ_i wi ni(x) )
Z = Σ_{x′ ∈ X} exp( Σ_i wi ni(x′) )
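The distribution can be made concrete with a toy ground network: a single weighted clause a ∨ b over two Boolean nodes. The clause and weight are an assumed example, not taken from the slides:

```python
import math
from itertools import product

w = 1.5                                      # weight of clause a ∨ b
worlds = list(product([False, True], repeat=2))  # assignments to (a, b)

def n(world):
    """Feature count n(x): 1 if the clause a ∨ b is satisfied."""
    a, b = world
    return 1 if (a or b) else 0

Z = sum(math.exp(w * n(x)) for x in worlds)  # partition function

def p(world):
    return math.exp(w * n(world)) / Z

# the world violating the clause is less probable, but not impossible
print(p((False, False)), p((True, True)))
```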
107. Markov Logic: Inference
Event recognition: querying about composite events.
Large network with complex structure.
Exact inference is infeasible.
MLN combine logical and probabilistic inference methods.
Given some evidence E = e, there are two types of inference:
1. Most Probable Explanation:
argmax(P(Q = q|E = e))
q
e.g. Maximum Satisfiability Solving.
2. Marginal:
P(Q = true|E = e)
e.g. Markov Chain Monte Carlo sampling.
109. Markov Logic: Machine Learning
Structure learning:
First-order logic formulas.
Based on Inductive Logic Programming techniques.
Weight estimation:
Structure is known.
Find weight values that maximise the likelihood function.
Likelihood function: how well our model fits to the data.
Generative learning.
Discriminative learning.
110. Markov Logic: Discriminative Weight Learning
In event recognition we know a priori:
Evidence variables: input data, e.g. simple events
Query variables: composite events
Recognise composite events given the input data.
Conditional log-likelihood function:
log Pw(Q = q | E = e) = Σ_i wi ni(e, q) − log Ze
Joint distribution of query Q variables given the evidence
variables E .
Conditioning on evidence reduces the likely states.
Inference takes place on a simpler model.
Can exploit information from long-range dependencies.
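The conditional log-likelihood can be illustrated on a toy model of my own (one binary query variable and one weighted feature that fires when the query agrees with the evidence — not a model from the tutorial):

```python
import math

w = 2.0                          # single assumed weight

def n(e, q):
    """Toy feature: fires when query q agrees with evidence e."""
    return 1 if q == e else 0

def cond_log_likelihood(e, q):
    # log Pw(Q = q | E = e) = w * n(e, q) - log Ze
    # Ze sums over query states only: conditioning on the evidence
    # leaves a much smaller model than the full joint.
    log_Ze = math.log(sum(math.exp(w * n(e, q2)) for q2 in (False, True)))
    return w * n(e, q) - log_Ze

print(cond_log_likelihood(True, True))   # log-probability of the agreeing query
```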
111. MLN-based Approaches
Most of the approaches:
Knowledge base of domain-dependent composite event
definitions.
Weighted definitions.
Input uncertainty expressed using auxiliary formulas.
Temporal constraints over successive time-points.
More expressive methods:
Based on Allen’s interval logic.
Event Calculus.
112. Event Calculus in MLN
Aim:
General-purpose event recognition system.
Uncertainty:
Imperfect composite event definitions.
Approach:
Express the Event Calculus in Markov logic.
113. The Event Calculus
A logic programming language for representing and reasoning
about events and their effects.
Translation to Markov Logic is therefore straightforward.
Key Components:
event (typically instantaneous)
fluent: a property that may have different values at different
points in time.
Built-in representation of inertia:
F = V holds at a particular time-point if F = V has been
initiated by an event at some earlier time-point, and not
terminated by another event in the meantime.
115. Event Calculus in MLN
[Figure: probability of a composite event over time (y-axis 0.0–1.0; x-axis 0–20); initiations at time-points 0 and 3, termination at time-point 10.]
Weighted initiation/termination rules:
2.3 CE initiatedAt T if [Conditions]
2.5 CE terminatedAt T if [Conditions]
Hard-constrained inertia rules:
CE holdsAt T iff CE holdsAt T−1, ¬(CE terminatedAt T−1)
¬(CE holdsAt T) iff ¬(CE holdsAt T−1), ¬(CE initiatedAt T−1)