Erik van Veenendaal

Improve IT Services BV, Bonaire

Erik van Veenendaal (www.erikvanveenendaal.nl) is a leading international consultant and trainer, and a recognized expert in software testing and requirements engineering. He is the author of a number of books and papers within the profession, one of the core developers of the TMap testing methodology and the TMMi test improvement model, and currently the CEO of the TMMi Foundation. Erik is a frequent keynote and tutorial speaker at international testing and quality conferences. For his major contribution to the field of testing, Erik received the European Testing Excellence Award (2007) and the ISTQB International Testing Excellence Award (2015). You can follow Erik on Twitter via @ErikvVeenendaal.

Test Process Improvement with TMMi – also in the Agile Era!

The Test Maturity Model integration (TMMi) has a rapidly growing uptake and is now the de-facto worldwide standard for test process improvement. Its growing popularity rests on it being the only independent test process improvement model, and on the simple presentation of maturity levels that it provides. A common misconception is that TMMi and Agile approaches are at odds. In fact, Agile approaches and TMMi can not only coexist, but successfully integrate to bring substantial benefits to both Agile and traditional development and testing organizations. This tutorial will show with examples that TMMi and Agile methods can effectively work together. The challenge is to apply lean principles and practices to empower Agile practices and facilitate TMMi practices. Whatever you do, a key to success is always to have the business needs and objectives drive the improvement process.

Erik has extensive practical experience in implementing the TMMi model, in both traditional and Agile environments, and in helping organizations improve the way they test. Many practical experiences and the results achieved are presented.

Key learning points of this tutorial are:

• to gain an overview and understand the background of the TMMi
• to understand the basics of test process improvement
• to be able to assess test maturity using the TMMi as a reference model
• to understand how to prioritize, define and implement practical test improvements
• to apply and use the TMMi beneficially in both traditional and Agile environments

Geoff Thompson

Planit Testing, UK

Geoff is the UK Director of Testing Services for Planit Testing, part of the global Planit Testing group. He has a real passion for software testing, test management and process improvement.
He is a founding member of the International Software Testing Qualifications Board (ISTQB), the TMMi Foundation, and the UK Testing Board, and is currently the Secretary of the ISTQB and Chairman of the UK Testing Board. He co-authored the BCS book Software Testing - An ISEB/ISTQB Foundation and is a recognized international speaker, keynoting at many conferences; he was the chair of EuroSTAR 2011. Geoff is Vice Chairman of the SIGiST and its Treasurer. In 2008 Geoff was awarded the European Testing Excellence Award, and in 2015 he received the Software Testing European Lifetime Achievement Award.

Session Based Testing

Geoff will introduce what Session Based Testing is, its relationship to Agile, and how test techniques play a large part in it. Geoff will also cover Exploratory Testing and how to manage session-based testing, before heading into a practical session that puts the learnings from this tutorial into place.

Graham Bath

T-Systems, Germany

Graham Bath is a principal consultant at T-Systems in the Digital Integration, Agile Testing division and uses over 30 years of testing experience to support customers with consultancy, training and test process improvements. Graham is the ISTQB Working Party chair for the Expert Level Certified Tester qualification and co-authored the new syllabus on Usability Testing. Graham is also a member of the German Testing Board and is a frequent presenter and tutorial provider at conferences around the world. He co-authored the book “The Software Test Engineer’s Handbook”.

Usability Testing in a Nutshell

Many of our modern-day applications rely on good usability to make an impact on the market. Even if an application functions perfectly, bad usability is likely to result in its failure. Considering its critical importance, it’s surprising how casually many projects approach usability; it’s often considered as something that can be done “implicitly” alongside functional testing, or is scheduled as an optional extra just prior to release.

This tutorial will open your eyes to the importance of usability and covers the principal approaches adopted in evaluating usability, user experience and accessibility. The “in a nutshell” format provides attendees with a thorough and structured overview of where usability fits into our overall testing strategy and provides a springboard for those wishing to develop their skills further.

Paul Gerrard

Gerrard Consulting, UK

Paul Gerrard is a consultant, teacher, author, webmaster, developer, tester, conference speaker, rowing coach and publisher. He has conducted consulting assignments in all aspects of software testing and quality assurance, specialising in test assurance. He has presented keynote talks and tutorials at testing conferences across Europe, the USA, Australia and South Africa, and has occasionally won awards for them.

Paul wrote, with Neil Thompson, “Risk-Based E-Business Testing” and several other Pocketbooks - “The Tester’s Pocketbook”, “The Business Story Pocketbook”, "Lean Python" and “Digital Assurance”.

He is Principal of Gerrard Consulting Limited, Director of TestOpera Limited and is the host of the Assurance Leadership Forum in the UK.

Problem Solving for Testers

In some organisations, it is perfectly fine for testers to report failures as they experience them. Capturing the details of behaviour that does not meet expectations, how to reproduce the problem, and an assessment of severity and/or priority might provide enough information to allow developers to diagnose and debug the problem. But in many situations, this simply does not work. For example, in a company that builds hardware and writes its own firmware and application software, diagnosing the source of a problem can be a difficult task. Where a device has many, many configurations, or connects to a range of other hardware, firmware or software applications, it might be impossible to reproduce the problem outside the test lab. In this tutorial, Paul explores how we can be deceived by evidence and how we can improve our thinking to be more certain of our conclusions. You’ll practice designing experiments, recognise what you can and cannot control, learn how to systematically diagnose the causes of failure, and work as a team to solve problems more effectively.

Maaret Pyhäjärvi

F-Secure Corporation, Finland

Maaret Pyhäjärvi is a software professional with a testing emphasis. She identifies as an empirical technologist, a tester and a programmer, a catalyst for improvement and a speaker. Her day job is working with a software product development team as a hands-on testing specialist with a focus on exploratory testing. In addition to being a tester and a teacher, she is a serial volunteer for different non-profits driving forward the state of software development. She was recently named Most Influential Agile Testing Professional Person 2016. She blogs regularly at http://visible-quality.blogspot.fi and is the author of two LeanPub books.

Exploratory Testing Explained and Experienced

This session sets out to clarify, through shared experiences from exercises, what it means to do exploratory testing. We look at what it is and how it is done, and why you should include the exploratory testing perspective in your projects. Exploratory testing (ET) isn’t new, but it has evolved further in agile projects with regression test automation. Great automation is created through an exploratory mindset, and there’s a place for throwaway automation in exploratory testing.

This workshop helps teams create a solid approach to exploratory testing that benefits from versatile automation ideas.

Jan Jaap Cannegieter

Squerist, Netherlands

Jan Jaap Cannegieter has 20 years of experience in ICT; he has done assignments in testing, quality assurance, process improvement, requirements, Agile and digitalization. Jan Jaap is now Principal Consultant at Squerist, a consultancy company of 90 employees specialized in process management and testing, and Test Manager at NEN, the Dutch Normalization Institute. Within Squerist, Jan Jaap is responsible for coaching, knowledge management and product development. Jan Jaap is a well-known author of several articles and books in the Netherlands.

Test Management in Agile

Agile development is profoundly changing the way we test, and in theory there is no longer a test manager role in Agile organizations. In this half-day tutorial we will see how test management activities can be organized in Agile organizations, how you can organize the different test levels, and how you can decide which way of testing (scripted, session-based, exploratory testing) is most appropriate in your situation. Although in theory there is no role for a test manager in Agile, in practice I see test-manager-like roles. We will discuss which role is applicable in which situation. By attending this tutorial you will understand how test management activities can be organized in Agile organizations and, in case you are a test manager right now, what possible roles you can take on in the future. This tutorial is relevant for every test manager and for everybody who works at an organization that is implementing Agile or has plans to do so.