Newsletter Subject

Practicing systems thinking to improve your software testing

From

ministryoftesting.com

Email Address

hello@ministryoftesting.com

Sent On

Wed, Nov 6, 2024 07:30 AM

Email Preheader Text

Discover how everyday systems thinking can be harnessed to improve your testing approach

Practicing systems thinking to improve your software testing

by Konstantinos Konstantakopoulos | [Read online at Ministry of Testing]

"A comprehensive understanding of the system under test and supporting systems like CI/CD allows me to test systems effectively because I know how they work inside and out. With this knowledge, I can understand how the application operates, identify where a bug belongs, and, most importantly, think through the test strategy."

Good news: you're already a systems thinker!

This article is inspired by real-life experiences that shaped my approach to software testing. Simply put: the activities of daily life require you to think about systems, whether or not they involve "technology." Once you realize you're already doing systems thinking, you can go deeper into understanding the systems you test on the job.

In your daily life, do you:

- Plan meals, shop for food, make and stick to a budget, and get well-prepared meals on the table in time for dinner?
- Take kids to school and then get to work on time?
- Fix things around the house, or learn how to walk others through an apparent problem?

Guess what: you're already a systems thinker! And this ability will serve you very well as a software tester. But you may need to develop the skill and practice it more deeply than you do right now.

Confronting a "broken" system: learning about my garage door

As a husband and parent, I often find myself solving household issues, like fixing something in the house or assembling a new toy for my son. But it was dealing with a broken remotely operated garage door that motivated me to write this article. I realized there was a lot of overlap between the way I approached learning about that problem and the way I approach learning situations as a tester.
Before asking for help or fixing things, I want to know how things work. When the garage door would no longer open with the remote control, I first searched the Internet to learn about its mechanism and how to fix it myself. But I also feared that any action I took might worsen the problem. So I called for help, but given my lack of understanding, I could not communicate effectively with the service professional. He asked questions using terms I didn't know, leaving us both frustrated.

So I continued searching for information about remotely operated garage doors. I discovered that the door operates via a mechanical motor connected to the house's power supply, and that it rolls up and down on three iron tracks connected to the motor. Although I eventually needed a repairperson to fix the door, I gained valuable insight into its workings, and I was now in a good position to discuss with the repairperson, in depth, what was wrong. I was also much less worried about my ability to handle future problems with the door.

Continuous improvement in systems thinking: when the garage door broke again

Sure enough, a few months later, the garage door stopped working again. This time it made a huge noise that made me think it was broken beyond repair. The confidence I'd gained after the first incident went out the window, the same way it does when a fix to a code issue stops working in production. I called the repairperson again, and they applied another "hotfix" by adjusting one of the iron tracks. During the repair, I asked questions about the issue and how it had happened, and I watched the repairperson's actions to understand how they applied the fix.

The next day, when I returned from picking up my son at school, I became concerned about using the garage door multiple times per day. Perhaps heavy use was the culprit in its malfunction. Then I realized that I now understood completely how it worked.
I had seen the inner parts and understood how they worked together. Should another problem emerge in the future, I thought, I should be able to handle it, or at least understand how the repair professional handles it. The same logic applies to how I approach software testing.

Building up your systems thinking practice in software testing

Sometimes our jobs call us to go deeper with systems thinking, whether or not we think we're ready. This is what happened to me.

A good start: thorough product knowledge

A few years ago, I worked as a tester at a startup that grew from a team of only 50 people to over 1,000 employees (Beat, formerly Taxibeat, now part of FreeNow). As the engineering department expanded, teams began taking ownership of specific areas to scale effectively. When I joined the company, I was responsible for testing the company's mobile application, covering the testing of newly planned features and performing regression testing cycles for mobile apps across five countries. Later on, I focused on the incentive features for both the passenger and driver mobile applications.

In my early years as a tester, I mainly conducted user acceptance testing and functional testing. I used tools like Postman for API testing, and Selenium and Appium for web and mobile UI test automation. My greatest strength was my deep understanding of the product area I was responsible for testing.

When deep product knowledge isn't enough…

All was well until our team took ownership of a new product area. I suddenly felt lost. We had to configure product settings via a new microservice connected to another service, which retrieved data from the Data Science department. All of this was managed through a backend monolith connected to an old admin panel. And the end-user functionality had to be available in both the iOS and Android mobile applications, for both taxi passengers AND drivers.
If you changed anything in the admin panel, with its multiple integration points, you ran the risk of breaking the configuration entirely. This situation was much like that of the garage door: an unknown system I needed to understand thoroughly before considering any test plan.

Embracing a deep understanding of systems

Sorting the nuts and bolts of system configuration

I discussed my concerns with my manager, and we agreed that being in charge of testing, on its own, wasn't an adequate safeguard. We decided that I should also "own" the admin panel and microservices configuration area so that I would learn their workings inside and out.

I spent many days learning the admin panel settings, reading all available documentation, and mapping the API endpoints for each service. I studied how the Data Science team sent information to the microservices through Kafka events, and I set up a local Kafka broker connected to the staging environment to understand the message flow. Then I mapped each setting in the admin panel to the endpoints used by our mobile apps. That was the hard part, because there was no current API documentation for the admin panel. I also got access to the GitHub repositories for the admin panel, backend, and microservices, cloning each repo and running them locally to understand the logic behind each setting.

Creating learning pathways: exploring product code

I didn't feel like a software engineer at first, but step by step, that changed. I began studying Git and version control, installed my first IDEs on my laptop, and used text editors like Sublime Text to open repositories and search code. I learned about design patterns, event-driven architecture, and microservices. I also explored repositories written in PHP, JavaScript, and Go, while our tests were written in Java, exposing myself to different programming languages and concepts. I acquired an experimentation mindset.
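That kind of message-flow learning can be sketched in miniature. The Python fragment below is a hypothetical illustration only: the event schema and field names are invented for this article and do not describe Beat's real Data Science events. It shows the sort of check you might run on a single Kafka event payload while working out which admin panel setting it drives.

```python
import json

# Hypothetical sketch: the schema and field names below are invented for
# illustration; they do not describe any real production event.
REQUIRED_FIELDS = {"driver_id", "incentive_type", "amount"}

def parse_incentive_event(raw: bytes) -> dict:
    """Decode one Kafka event payload and check that it carries the
    fields the admin panel configuration depends on."""
    event = json.loads(raw.decode("utf-8"))
    missing = REQUIRED_FIELDS - event.keys()
    if missing:
        raise ValueError(f"event missing fields: {sorted(missing)}")
    return event

# One hand-written payload, of the kind you might replay through a local
# broker while learning the staging message flow.
sample = b'{"driver_id": 42, "incentive_type": "bonus", "amount": 7.5}'
```

Running small validations like this against messages replayed from a local broker is a low-risk way to learn a schema before committing to any formal test design.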
If there was anything I didn't understand, I searched the Internet and consulted the developers on my team. I subscribed to multiple newsletters and blogs, followed experts in the testing field, and began studying engineering concepts daily. This deep dive into the engineering world and the system's workings became a norm for me. I now know that a daily commitment to continuous learning and research is a must-have for testers and engineers.

Establishing a regular practice of systems thinking in test

Today, I regularly take deep dives into the systems I work on. I gain access to all repositories, architectural diagrams, API documentation, [C4 diagrams], and business logic documents. I try to understand the continuous integration and delivery (CI/CD) process and how the test environments are deployed. A comprehensive understanding of the system under test and supporting systems like CI/CD allows me to test systems effectively because I know how they work inside and out. With this knowledge, I can understand how the application operates, identify where a bug belongs, and, most importantly, think through the test strategy.

My systems thinking practice results in these actions on my part:

- I make sure that I understand the risks. I know the areas that should be covered when a particular aspect of the code changes, the layers that are tested, and those that are not.
- I evaluate test coverage at lower levels, such as integration tests in the application code.
- I understand whether or not we incorporate any end-to-end tests in our deployment process. By the book, an end-to-end test describes a business process or a user flow, meaning we need the whole system deployed somewhere.
- I map acceptance criteria and requirements to test scenarios.

As the book Thinking in Systems by Donella Meadows beautifully puts it: "The behavior of a system is its performance over time — its growth, stagnation, decline, oscillation, randomness, or evolution.
If the news did a better job of putting events into historical context, we would have better behavior-level understanding, which is deeper than event-level understanding. When a systems thinker encounters a problem, the first thing he or she does is look for data, time graphs, the history of the system. That's because long-term behavior provides clues to the underlying system structure. And structure is the key to understanding not just what is happening, but why."

Tester or engineer? All roles need good systems thinking practices

Reading business requirements and acceptance criteria is essential to understanding customer needs, but it's not enough for effective testing. Without knowing the system thoroughly, you'll face too many unknowns that slow down the team. Effective communication about quality issues, for instance, requires detailed knowledge of the system to pinpoint issues accurately. There have been times when I identified a problem during testing and considered it a defect, only to realize it was due to a test setup that wouldn't occur in the real world.

This deeper understanding of systems has made me a better engineer. It's crucial to recognize that, as a professional in the software testing field, no matter what your official role is, you are a software engineer responsible for the system's quality. You must know how your system works, from the backend to the frontend, to deliver the best quality to customers.

In my current position as a Principal Software Engineer in Test at Orfium, I strongly advocate for a role that combines both testing and engineering mindsets. This approach is reflected in our hiring process, job postings, onboarding procedures, and the guidelines we have established. The role of the tester within the company should be closely integrated with the development team and be a part of the engineering department.
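The acceptance-criteria-to-scenario mapping listed among the practices above can be made concrete as a simple coverage check. This is a hypothetical Python sketch: the criterion IDs and scenario names are invented for illustration.

```python
# Hypothetical sketch: the acceptance-criterion IDs and scenario names
# below are invented for illustration.
criteria_to_scenarios = {
    "AC-1: passenger sees the incentive banner": ["test_banner_shown"],
    "AC-2: driver bonus is applied once per ride": ["test_bonus_applied_once"],
    "AC-3: config change reaches the mobile apps": [],  # gap: no scenario yet
}

def uncovered_criteria(mapping):
    """Return the acceptance criteria that have no test scenario mapped."""
    return [criterion for criterion, scenarios in mapping.items() if not scenarios]
```

Keeping such a mapping next to the test code makes coverage gaps visible at a glance instead of leaving them implicit in someone's head.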
And the tester must thoroughly understand the system under test so that their work aligns with the guidelines and mission we aim to serve.

Good test automation depends on disciplined systems thinking

Immersing yourself in how your product's various components interact can significantly elevate your test automation efforts. Approaching automated checks (my preferred term for automated tests) with a systems thinking mindset strengthens your confidence in both test planning and design, resulting in meaningful tests. Each time these tests run, they allow you to gather valuable feedback that reflects your software's business logic and actual user flows, enabling you to communicate the quality status of your system confidently before and after releases. Ultimately, this contributes to the overall quality of your software.

As your software evolves, consistent maintenance of your test automation projects is paramount to ensure they continue providing meaningful feedback to your teams. Embracing a systems mindset equips you with the skills needed to implement, maintain, and scale your test code effectively. You'll find satisfaction in exploring essential questions such as why to test, what to test, how to test, and when to run tests within your team's CI/CD process. Cultivating a mindset rooted in systems thinking not only improves the efficiency of your test automation but also inspires your team's engineers to embrace the approach, significantly improving the overall quality of how your organization develops software.

To wrap up

The discipline of understanding systems deeply, akin to planning a week's meals for your family or troubleshooting a remotely operated garage door, is vital to effective software testing. Strive to experiment and learn something new about your application every day, eager to learn and to rethink the way you test your software. This can-do mentality, and a practice of curiosity and growth mindset, can permeate your teams.
With it, you can embody the true essence of a software engineer in test.

For more information

- [The Ministry of Testing Community's Guide to Critical Thinking]
- [Hurdles of Testing Large-scale Software Systems] by Michaela Greiler
- [The Automation Break Up: Saying Goodbye to Full Stack Tests with Task Analysis] by Mark Winteringham

🚀 [The MoT Software Testing Essentials Certificate is here! Early bird ends 22nd December 2024!]

The Essentials Certification from Ministry of Testing is a modern introduction to the world of testing software. It's created with a forward-thinking lens and with current experts from within the software testing industry. Our goal is to equip the next generation of software testers to move forward with confidence in their careers.

👋 Level up your software testing career with MoT Professional Membership. [Available for individuals and teams]

Copyright © 2024 Ministry of Testing, All rights reserved. Mailing address: Ministry of Testing, 19 New Road, Brighton, East Sussex BN1 1UF, United Kingdom.
