In some ways, software testing and automated testing tools are following a path similar to that of traditional development. The following brief history of the evolution of software development shows how deviations from prior best practices are also being observed in the software testing process.
The first commercial computers were developed in the 1950s, and FORTRAN was the first widely used high-level programming language. In the late 1960s, the concept of "structured programming" held that any program can be written using three simple constructs: simple sequence, if-then-else, and do-while statements. There were other prerequisites, such as the program being a "proper program," meaning it has exactly one entry point and one exit point. The focus was on the process of creating programs.
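As a rough illustration, here is a minimal sketch in Python (rather than the FORTRAN-era notation of the time) of a "proper program" built from only these three constructs; the function name and data are purely hypothetical:

```python
# A "proper program": one entry point and one exit point, built from only
# the three structured-programming constructs.

def classify_scores(scores):
    """Count passing and failing scores using sequence, selection, and iteration."""
    passing = 0                     # simple sequence: statements executed one after another
    failing = 0
    i = 0
    while i < len(scores):          # iteration (do-while expressed as a top-tested loop)
        if scores[i] >= 60:         # selection: if-then-else
            passing += 1
        else:
            failing += 1
        i += 1
    return passing, failing         # single exit point

if __name__ == "__main__":
    print(classify_scores([55, 72, 90, 40]))   # -> (2, 2)
```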
In the 1970s, the development community focused on design techniques. They realized that structured programming was not enough to ensure quality: a program must be designed before it can be coded. Techniques such as Yourdon's, Myers's, and Constantine's structured design and composite design flourished and were accepted as best practice. The focus still had a process orientation.
The philosophy of structured design was partitioning and organizing the pieces of a system. Partitioning means dividing the problem into smaller subproblems so that each subproblem eventually corresponds to a piece of the system. Highly interrelated parts of the problem should be in the same piece of the system; that is, things that belong together should go together. Unrelated parts of the problem should reside in unrelated pieces of the system; things that have nothing to do with one another do not belong together.
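As a hedged illustration of this principle (the module and function names are hypothetical, not drawn from the structured-design literature), the sketch below keeps highly interrelated pricing logic together in one piece of the system and places unrelated report formatting in another:

```python
# pricing.py -- highly interrelated order-pricing logic kept together in one piece

def subtotal(items):
    """Sum of quantity * unit price across the line items."""
    return sum(qty * price for qty, price in items)

def apply_discount(amount, rate):
    """Apply a fractional discount to an amount."""
    return amount * (1 - rate)

def total(items, discount_rate=0.0):
    """Order total: subtotal with the discount applied."""
    return apply_discount(subtotal(items), discount_rate)


# reporting.py -- unrelated formatting concerns reside in a separate piece of the system

def as_currency(amount):
    """Render a number as a currency string."""
    return f"${amount:,.2f}"
```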
In the 1980s, it was determined that structured programming and software design techniques were still not enough: the requirements for a program must first be established if the right system is to be delivered to the customer. The focus shifted to quality, which occurs when the customer receives exactly what he or she wanted in the first place.
Many requirements techniques emerged, such as data flow diagrams (DFDs). An important part of a DFD is the store, a representation of where application data will be kept. The concept of a store motivated practitioners to develop a logical view of the data; previously, the focus had been on the physical view of data in terms of the database. The concept of a data model was then created: a simplified description of a real-world system in terms of its data, that is, a logical view of the data. The components of this approach included entities, relationships, cardinality, referential integrity, and normalization. This also created a chicken-and-egg controversy over which should come first: the process or the data. Prior to the logical representation of data, the focus had been on the processes that interfaced with databases. Proponents of the logical view of data initially insisted that data should be the first focus of analysis, followed by the processes. In time, it was agreed that process and data must be considered jointly in defining the requirements of a system.
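A minimal Python sketch of these data-model components, using hypothetical entities: two entities (Customer and Order), a one-to-many relationship expressing cardinality, and a simple referential-integrity check that every order references an existing customer:

```python
from dataclasses import dataclass

@dataclass
class Customer:          # entity: a logical description of a real-world thing
    customer_id: int
    name: str

@dataclass
class Order:             # entity
    order_id: int
    customer_id: int     # relationship: each Order refers to exactly one Customer
    amount: float

def orphaned_orders(customers, orders):
    """Referential integrity: every order must reference an existing customer
    (cardinality: one customer, many orders)."""
    known = {c.customer_id for c in customers}
    return [o for o in orders if o.customer_id not in known]

customers = [Customer(1, "Acme"), Customer(2, "Globex")]
orders = [Order(10, 1, 250.0), Order(11, 3, 99.0)]    # order 11 has no matching customer
print(orphaned_orders(customers, orders))             # -> [Order(order_id=11, ...)]
```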
In the mid-1980s, the concept of information engineering was introduced, a new discipline that helped lead the world into the information age. With this approach, there is more interest in understanding how information can be stored and represented, how it can be transmitted through networks in multimedia forms, and how it can be processed for various services and applications. Analytical problem-solving techniques, supported by mathematics and related theories, were applied to engineering design problems. Information engineering stressed the importance of taking an enterprise view of application development rather than the view of a single application. By modeling the entire enterprise in terms of processes, data, risks, critical success factors, and other dimensions, it was proposed that management would be able to run the enterprise more efficiently.
During this same time frame, fourth-generation computers embraced microprocessor chip technology, and secondary storage advanced at a fantastic rate, with devices holding tremendous amounts of data. Software development techniques had vastly improved, and 4GLs made the development process much easier and faster. Unfortunately, the emphasis on quick turnaround of applications led to a retreat from fundamental development techniques in order to "get the code out" as quickly as possible. This de-emphasis on requirements and design still persists today in many software development organizations.