CAST 2018
Sea Oats
Wednesday, August 8
 

09:00 EDT

CAST Welcome
Eric Proegler is a Director of Test Engineering for Medidata Solutions in San Francisco, California.

Eric is the Vice President and Treasurer for the Association for Software Testing. He is also the lead organizer for WOPR, the Workshop on Performance and Reliability. He’s presented at CAST, Agile2015, Jenkins World, STARWEST, Oredev, CodeFest, CMG Impact, STPCon, PNSQC, and multiple WOPRs.

In his free time, Eric spends time with family, leads a science fiction book club, sees a lot of stand-up comedy and live music, seeks out street food from all over, collects beer growlers, plays video games, and follows professional basketball.

Speakers

Eric Proegler

Director, Test Engineering, Medidata Solutions


Wednesday August 8, 2018 09:00 - 09:15 EDT
Sea Oats

09:15 EDT

Cynefin for Testers
Whenever we do anything new, we make discoveries, and often those discoveries force us to change direction and rethink our goals. In a world which embraces uncertainty, and in which innovation means trying things out and iterating more often than analyzing and predicting, what's the role of a tester?

In a world of change, where a quick reaction to problems is often a better approach than a prediction of them, we look at how a tester's mindset and skills can still bring much-needed clarity, ensuring coherence in the experiments we perform and making sure that they're safe-to-fail.

Speakers

Liz Keogh

Lunivore
Liz Keogh is a Lean and Agile consultant based in London. She is a well-known blogger and international speaker, a core member of the BDD community, and a contributor to a number of open-source projects including JBehave. Liz specializes in helping people use examples and stories to...


Wednesday August 8, 2018 09:15 - 10:30 EDT
Sea Oats

10:45 EDT

An Introduction to Domain Testing: More than Quick Tests
We will be conducting several exercises with debriefing sessions afterwards and small presentations focused on filling in the relevant context.

 Students can expect to:
  • Test a sample application
  • Identify, classify and discuss data types used by variables in the application
  • Determine whether those variables are good candidates for boundary analysis, equivalence class partitioning, or both
  • Generate best representative data and put that information into a classic domain testing table
  • Work with other students to generate and catalog data for later use

Learning Objectives:
  • What Domain Testing is
  • What it is not
  • How to apply it in a systematic way

Testing is a complex activity, and developing competence requires mastery of many tasks and practice, practice, practice. Yet there are very few places where we get a chance to practice applying test techniques and get feedback on how we are doing. This is that place.

When used appropriately, Domain Testing (an umbrella term for boundary analysis and equivalence class partitioning) can increase our efficiency by helping us run less redundant, more powerful tests. How? We’re often faced with too many possibilities to test and too much data, and Domain Testing helps us create a sample of the best representative data to use. In this workshop, we are going to provide an experience that helps all members of the team better understand the types of data we are using and how to craft powerful tests. We’re going to work through a number of examples while testing an application.

Through sampling, we’ll find the best representative data we think will cause failures and find bugs. No matter how much testing you do, if you do any at all, you can learn from this course.

So, what about the rest of the team? What if developers were more aware of Domain Testing and used it to increase the effectiveness of their unit “checks”? What if business analysts were better prepared to provide examples that include better data points from domain analysis? Domain Testing is a technique the whole team can use to reduce risk in the data the application uses.
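As a rough sketch of the technique described above (illustrative only, not taken from the workshop materials), here is how equivalence class partitioning and boundary analysis might look in Python for a hypothetical integer "age" field that accepts values from 18 to 120; the field name and range are assumptions:

```python
# Domain testing sketch: equivalence classes and boundary values
# for a hypothetical integer "age" field that accepts 18..120.

VALID_MIN, VALID_MAX = 18, 120

def classify(age: int) -> str:
    """Map an input to its equivalence class."""
    if age < VALID_MIN:
        return "too_low"
    if age > VALID_MAX:
        return "too_high"
    return "valid"

def boundary_values(lo: int, hi: int) -> list[int]:
    """Best representatives: values at and just beyond each boundary."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

# Pair each boundary candidate with its expected class -- the raw
# material for a classic domain testing table.
table = {v: classify(v) for v in boundary_values(VALID_MIN, VALID_MAX)}
print(table)
# {17: 'too_low', 18: 'valid', 19: 'valid', 119: 'valid', 120: 'valid', 121: 'too_high'}
```

The six boundary candidates, plus one interior representative per class, would become the rows of the domain testing table the workshop asks students to build.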

Speakers

Dwayne Green

Quality Champion, 1800Contacts
Dwayne Green is a Team Lead of Testing at 1-800 Contacts located in Draper, UT. He has 10 years of industry experience and has a passion for developing skills in testing. He has been active in local testing meetups. Dwayne can be reached on Twitter @n00btester.

Chris Kenst

Automation Engineer, Bloomnation, Inc
Chris Kenst is a Test Automation Engineer at BloomNation working to help them accelerate the achievement of shippable quality.


Wednesday August 8, 2018 10:45 - 12:15 EDT
Sea Oats

14:00 EDT

Coaching Software Testing Using the GROW Model
Coaching is one of the most talked-about topics of recent times. So what do you do when you are tasked with coaching developers on software testing? Come and find out as I explore and share my experience using the GROW model to coach software engineers.

Coaching is an almost essential activity in today’s software world. 18 months ago, Redgate Software, the company I work for, decided to shift the testing activities and quality responsibility from testers to software engineers.

The company didn’t want to just drop those responsibilities on engineers without some help, so it tasked some people with becoming Quality Coaches, which is my current job title.

As well as talking about my journey as a coach, I would like to present an experience report on using the GROW model (from John Whitmore) to coach developers, and other roles including user experience designers, on software testing.

It’s been one of the most interesting techniques I’ve heard about and studied before putting it into practice. It can be applied in any context, and so far I’ve had moderate success with it, although it wasn’t all plain sailing and I still have a long way to go.

Takeaways:
  • The different meanings of the word coaching
  • What a quality coach does and doesn’t do
  • The 4 GROW model stages
  • Report on techniques to help overcome barriers that I’ve found along the way
  • Real work situations where the GROW model helped, and others where it didn’t and shouldn’t have been used
  • Additional resources that relate in particular to the testing context

Speakers

Jose Lima

Test Engineer, Redgate Software
José started his professional career as a test engineer at Cambridge-based Redgate Software, and has always been an advocate of quality. Last year he became a quality coach in the hope of spreading the lessons he'd learned to the various product teams and software engineers. He spends...


Wednesday August 8, 2018 14:00 - 15:00 EDT
Sea Oats

15:15 EDT

Recruiting for Potential
In early 2017 I was promoted to QA manager and, right off the bat, thrown into two recruitment processes. I was terrified. I knew from previous experience that I was really bad at traditional interviewing techniques, and suddenly I could not even hide behind someone else making the decisions. During my career I've interviewed potential colleagues, team members and interns, and I've always felt the outcome depended heavily on the candidate’s confidence rather than on my questions.
Our recruitment process included three interviews and three online tests. I felt it tended to favor glossy resumes and interview-trained professionals as well as being biased towards whatever personality type the recruiting manager had.

I wanted to do something different. Something that used my testing and programming background and that could be used to assess both juniors and seniors on an even playing field.
I started out looking for available exercises, but the things I found were limited, generic, and all focused on testing in front of other people. This also favors a particular type of person and, in addition, it wouldn’t give me all the answers I wanted:

• How well do they read instructions?  
• Do they have the guts to question?
• Can they make reasonable assumptions?
• How do they adapt to something unexpected?
• Can they document and communicate their findings?
• Can they answer questions about their work?
• ...

In this experience report I’ll share my thoughts on why traditional interview processes are outdated and I’ll show you an alternative way of doing it. I’ll talk about successes, setbacks and how we plan to improve the exercise moving forward.

It's about figuring out what makes a tester, how to compare apples to biscuits and how you should always expect the unexpected.

In short: I will talk about putting candidates to the test.

Takeaways:
  • Why standard recruitment processes are biased and focus too much on history
  • Ideas on how to improve recruitment processes for testers or other roles
  • How to design a scope small enough to handle but with enough challenge

Speakers

Lena Wiberg

AFA Försäkring AB
Lena first entered the IT industry during the last trembling years of the 20th century. She currently works as a team manager for QA teams but started out as a programmer, which shows in her chosen methods and approaches. She has a passion for making quality a state of mind rather...


Wednesday August 8, 2018 15:15 - 16:15 EDT
Sea Oats

16:30 EDT

Defined by Your Own Tools
“If there’s a pencil in your pocket, there’s a good chance that one day you’ll feel tempted to start using it.” (Paul Auster, “The Red Notebook”)

The tools we use change the way we see reality and operate. Opening up the browser’s dev tools provides visibility into areas we are unfamiliar with. Learning about consistency heuristics gives us a means to discuss why a behaviour is (or isn’t) desired. Each tool we pick up expands our mind and our options, and many times we don’t know we are missing a tool until we actually use it.

In this talk I’ll share my ways of finding new tools, through examples of tools that had a significant impact on my abilities as a tester in various ways:
  • Encoded knowledge repositories (ZAP or WAVE)
  • Transferable skills (xpath, scripting)
  • Visibility enhancers (web proxy, resource monitor)
  • Simple productivity tools (Ditto, Excel)

Takeaways:
  • We only test what is easy. Tools make stuff easier.
  • Take tools out of context to discover new ways to use them
  • Explore the tool and find solutions to problems you didn’t know you had
  • Categories of influence, and how to use them

Speakers

Amit Wertheimer

RSA
Amit has been testing software in the e-commerce space at RSA for the past 6 years, and enjoying every moment testing, writing code, dealing with security, and learning what it means to be a tester in today’s world. In addition, he helps organize a local meetup group and as a co-editor...


Wednesday August 8, 2018 16:30 - 17:30 EDT
Sea Oats
 
Thursday, August 9
 

09:00 EDT

What I Talk About When I Talk About Testing
The language we use when describing our testing is important to understanding our craft. This workshop is geared toward clarification of the soft skills of our testing practices. We will share the best outcomes and document new talking points that we can apply in our evolving careers as testers.


“What I talk about when I talk about testing” is a workshop geared toward testers and non-testers alike. It will be a short presentation followed by a group discussion. We will share scenarios and situations we have encountered where we had to describe our work, or elaborate on our processes.
This will be an overview of the non-technical side of software testing - the soft skills we bring to the agile process, and the communication and perception challenges that we sometimes face in our careers.

How do you describe the testing that you are doing to the stakeholders on the project? How do you respond to questions in a job interview? How do you interact during discussions with recruiters, or with account people who are selling your skillset to a client? How do you quantify what you do in order to sell yourself and software testing accurately? The language we use is very important in a craft that is often misunderstood.

It’s important to explain not just what we DO as software testers; it is equally important to show how we interact with other development team members. We need to describe how the testing process often goes much deeper than just ‘checking’ whether the software behaves according to requirements. We possess soft skills that are just as important as our technical skills. But how do we convey that?

We will have a group discussion about our interactions with the following individuals and uncover a list of good practices:
* Project Stakeholders
* Product Owners
* Developers
* Scrum masters and project managers
* Account & sales people
* Talent acquisition / Recruiters

Speakers

Jim Warchol

SafeNet Consulting
I am currently employed at SafeNet Consulting in Milwaukee, WI as a Software Tester, but I have expanded my skill set to act as a Business Analyst and Scrum Master depending on the needs of my current project team. I have been a software tester for 9+ years, and to date I have had...


Thursday August 9, 2018 09:00 - 10:30 EDT
Sea Oats

10:45 EDT

Designing Test. Testing Design.
Over the years, I’ve noticed many shared experiences between the “specialist” roles: UX, visual designers, testing, data scientists, architects, security analysts, operations, and even agile coaches.

In particular I started to notice the similarities between UX and testing, including a core interest in the quality and reliability of user interactions (where humans touch the machine), an appreciation for the complexity of those interactions, and a passion to explore “what could go wrong…” in a quest for better (and safer) outcomes.

And one thing everyone shared: trying to “fit” into a framework (Scrum, for most of us) that never really worked out the kinks of our involvement.

Which got me thinking…how might we better design the interface to (and the experience of) working with testing? How might we use service design, org design, product thinking, and systems thinking to address the information asymmetries / blind spots that leave testing in a perpetual state of push-based advocacy? What can testing learn from UX? And what can UX learn from testing?

My hope with this talk is that we can explore these shared experiences and questions with stories from my career, a bit of theory, a lot of practice, some drawings/doodles, and (likely bad) jokes.


Speakers

John Cutler

John Cutler is keenly focused on user experience and evidence-driven product development. He mixes and matches various methodologies — jobs-to-be-done, Lean UX, Lean Startup, customer development, and design thinking — to help teams deliver lasting outcomes for their customers. “Team...


Thursday August 9, 2018 10:45 - 12:00 EDT
Sea Oats

14:00 EDT

Let's Hack Your Team and Organization w/ The Best Inquiry Skills Ever
In this interactive workshop, you will be introduced to and practice some of the basic techniques of Clean Language and Systemic Modeling. Clean Language is a set of 12 basic questions that are non-leading and assumption free. Systemic Modeling is the art of using Clean Language to explore mental models in individuals or groups. It is incredibly powerful in helping a group to discover its own symbols and metaphors, and will lead members to pay exquisite attention to each other. Group communication becomes a platform for sustainable inquiry, learning, clarity and trust.

In this intro to Clean Language and Systemic Modeling workshop, get a taste of some of the questions, techniques and models, such as:
  • Metaphor Warmup, have fun loosening up that analytic brain.
  • Two Lazy Jedi Questions, learn and practice the two most useful questions on earth for inquiry.
  • An Intro to the Structure of Clean Questions, how is the syntax of a clean question in fact ‘clean’? Receive a copy of all of the questions, and how each is used.
  • Working At Your Best - create and share a symbolic landscape. We’ll use the questions on each other to learn each other’s best state for working.
  • Clean Setup - what questions people can use to prepare for meetings, facilitations, study sessions, workshops, and trainings in a collaborative manner, also using symbols and metaphors
  • Clean Feedback - you’ll learn to separate observation from meaning and impact. This can be used to reflect back on both positive and negative and to think forward about what might work better.
You’ll use these techniques to encourage curiosity and diversity. You and your team members will have a powerful tool to defuse conflict and misunderstandings, give effective feedback, and set up for great collaborative work. You’ll be able to celebrate different opinions and ways of working, and to help enhance each other’s thinking.

Simple enough to start using right away, rich enough that if practiced with others, it will start to amplify your collective results at work.

Speakers

Andrea Chiou

Enterprise Agile Coach, Tenable
I work remotely as an Enterprise Agile Coach at Tenable, Inc., a cyber-exposure and vulnerability detection company located in Columbia, Maryland. I care about 'vulnerability detection' in the teams I coach too. Just like with IT systems, we'll have to develop awareness of each others...



Thursday August 9, 2018 14:00 - 15:30 EDT
Sea Oats

15:45 EDT

Wearing Hermione’s Hat: Narratology for Testers
When we find those pesky bugs in production, hindsight often allows us to see what logical leaps and assumptions we made, and how we could have caught them, had we only known. Narratology is the study of the structure of stories. I’ll show how to apply its lessons to our software development and testing using examples from the Harry Potter narrative.

Takeaways:
- The basics of narratology: what it is and how to apply it to software development and testing.
- Ways to reexamine your own assumptions and foster critical thinking
- The building blocks of storytelling

Speakers

Marianne Duijst

Sogeti
Marianne is employed as a Test Specialist at Sogeti, and held previous roles as a Software Engineer, Scrum Master, Developer and High School Teacher. She enjoys speaking about critical thinking, work culture and being a Girl Scout. She sketchnotes live at conferences to share and...


Thursday August 9, 2018 15:45 - 16:45 EDT
Sea Oats

16:45 EDT

CAST Closing
Let's say goodbye! We'll miss you - until next time!

Speakers

Eric Proegler

Director, Test Engineering, Medidata Solutions


Thursday August 9, 2018 16:45 - 17:00 EDT
Sea Oats
 