| Content Provider | ACM Digital Library |
|---|---|
| Author | Sussman, Gerald Jay |
| Abstract | Engineers of large systems must be concerned with both the design of new systems and the maintenance of existing systems. A great deal of effort goes into arranging things so that systems may be maintained and extended when needed, and so that new systems fit into the context of existing structures. Insight about the support of software engineering can thus be derived by examining how other branches of engineering support their version of the processes of engineering. Before embarking on a design project, an engineer must first determine the customer's need. This is always a complex process, involving modeling of the customer's situation to determine exactly what is essential and what is accidental. The specifications derived from this analysis are the input to the design process. The specifications may be expressed in varying degrees of formality. Though mathematical formalism is to be desired, because it is unambiguous and because it can be manipulated precisely, existing mathematical technique is usually inadequate to specify the requirements precisely. This means that the specification phase must be part of the debugging loop. The design engineer first attempts to meet the specifications with some existing artifact. If this fails, he attempts to synthesize an artifact that meets the specifications by combining the behaviors of several parts, according to some standard plan. For example, in electrical engineering, a complex signal-processing system can often be composed as a cascade of simpler signal-processing components - an engineer knows that the transfer function of a cascade is the product of the transfer functions of the parts cascaded, if the loading is correct. Such design depends on the ability to compute the behavior of a combination of parts from their specifications and a description of how they are combined.
Often such analysis must be approximate, with bugs worked out by simulation and debugging of breadboard prototypes. This design strategy is greatly enhanced by the existence of compatible families of canned parts with agreed-upon interfaces and well-defined specifications. If the family of canned parts is sufficiently universal, the interfaces sufficiently well specified, and if design rules can be formulated to isolate the designer from the details of the implementation of the parts, the family constitutes a design language. For example, TTL is a pretty good design language for relatively slow digital systems. Just as the approximations of analysis are often imperfect, the abstraction barriers of the design language are often violated, for good reasons. Thus there are the inevitable bugs. These may be found in simulation or prototyping. One key observation is that all phases of engineering involve bugs and debugging. Bugs are not evidence of moral turpitude or inadequate preparation on the part of the engineer; they are an essential aspect of effective strategy in the problem-solving process. Real-world problems are usually too complex to specify precisely, even assuming that we have adequate formal language to support such specifications. (Imagine trying to formalize what is meant by "My program plays chess at the Expert level." or "My music synthesizer circuit makes sounds that are like a fine violin." in any precise way, yet surely such specifications are the meat of real engineering.) And even where one can write very precise specifications (as in "This circuit makes 1/f noise."), such specifications are often mathematically intractable in current practice. Even worse, analysis of systems made up of a few simple, precisely specifiable components, such as transistors (here a few exponential equations suffice) or floating-point additions, is mathematically intractable. Thus engineers, like physicists, make progress by approximate reasoning.
Linearizations are made, and deductions that ignore possibly important interactions are used to plan designs, with the explicit intention of finding the relevant ignored effects in the simulation or prototype debugging phase of design. Bugs thus arise from the deliberate oversimplification of problems inherent in using perturbational methods to develop answers to hard problems. Finally, bugs often arise, in apparently finished products, because of unanticipated changes in the requirements of the customer. Although these are technically not errors in the design, the methods we have for patching a design to accommodate a change in requirements amount to debugging the installed system. Software engineering needs appropriate tools to support each of the phases of the engineering process. There must be tools to aid with specification and modeling, with synthesis and analysis, with rapid prototyping and debugging, with documentation, verification, and testing, and with maintenance of finished products. In addition, there must be environmental tools to support the engineering process in the large. Our tools must support the natural processes of problem solving. They must provide precise ways to talk about alternative design plans and strategies. There is no doubt that mathematical formalisms, such as logic and abstract algebra, are essential ingredients in such an enterprise, but we must be careful to separate our concerns. Mathematical formalisms and technique are rarely strong enough to provide better than very approximate models of interesting physical, economic, or engineered systems. Sometimes a system that is readily described as a computer program is not easily formalized in more conventional terms. This suggests the exciting possibility that some difficult theoretical constructs can be formally represented as computational algorithms. We can expect to manipulate these as we now manipulate equations of classical analysis.
Such a paradigm shift is already taking place in control theory. Twenty years ago, the dominant method for making control systems was synthesizing them using feedback, and the dominant theory was concerned with the stability of linear feedback systems. In contrast, most modern control systems are microprocessor-based, and the theory is now much more qualitative. Moreover, the program for the microprocessor is often a much simpler description of the strategy of control than the classical equations, and one can thus express much more complex strategies than were previously feasible to analyze and synthesize. An even more striking revolution has occurred in the design of signal-processing systems. The nature of the field has changed completely, because digital filters are algorithms. Artificial intelligence research often uses programs as theoretical constructs, akin to equations and schematic diagrams, but with the added feature that programs that embody parts of a theory of the design of programs can be used as tools in the process of theory construction (or software development). The language Lisp, for example, was initially conceived as a theoretical vehicle for recursion theory and for symbolic algebra. Most AI experiments are formulated in Lisp. Lisp has developed into a uniquely powerful and flexible family of software development tools, providing wrap-around support for the rapid prototyping of software systems. As with other languages, Lisp provides the glue for using a vast library of canned parts, produced by members of the AI community. In Lisp, procedures are first-class data, to be passed as arguments, returned as values, and stored in data structures. This flexibility is valuable, but most importantly, it provides mechanisms for formalizing, naming, and saving the idioms - the common patterns of usage that are essential to engineering design.
In addition, Lisp programs can easily manipulate the representations of Lisp programs - a feature that has encouraged the development of a vast structure of program synthesis and analysis tools, such as cross-referencers. |
| Starting Page | 397 |
| Ending Page | 399 |
| Page Count | 3 |
| File Format | |
| ISBN | 0818606207 |
| Language | English |
| Publisher | Association for Computing Machinery (ACM) |
| Publisher Date | 1985-08-01 |
| Access Restriction | Subscribed |
| Content Type | Text |
| Resource Type | Article |
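The abstract's electrical-engineering example - that the transfer function of a correctly loaded cascade is the product of the stages' transfer functions - can be sketched in a few lines. This is an illustration of the rule the abstract cites, not code from the article; the stage functions and cutoff values are hypothetical, and only filter magnitudes are modeled.

```python
# Sketch of the cascade rule: with correct loading, the transfer function
# of a cascade is the product of the stages' transfer functions.
# Stage definitions and values are illustrative, not from the article.

def cascade(*stages):
    """Compose per-frequency transfer functions H_i(f) into one H(f)."""
    def H(f):
        result = 1.0
        for stage in stages:
            result *= stage(f)  # product of the parts cascaded
        return result
    return H

def lowpass(fc):
    """Magnitude response of a hypothetical first-order low-pass stage."""
    return lambda f: 1.0 / (1.0 + (f / fc) ** 2) ** 0.5

# Two identical stages: each passes 1/sqrt(2) at its cutoff, so the
# cascade passes 1/2 there.
H = cascade(lowpass(100.0), lowpass(100.0))
print(H(0.0))    # -> 1.0 (DC passes unattenuated)
print(H(100.0))  # -> 0.5 (product of two 1/sqrt(2) factors)
```

The same `cascade` helper works for any number of stages, which is the point of the abstract's "standard plan": the combination's behavior is computed from the parts' specifications alone.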
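The abstract's claim that first-class procedures let one formalize, name, and save idioms can also be sketched. The idiom named below (`compose`) and the sample procedures are illustrative stand-ins, rendered in Python rather than the Lisp the abstract discusses.

```python
# Procedures as first-class data: passed as arguments, returned as values,
# and stored in data structures. The idiom being named here - composition -
# is itself captured as a procedure. Names are illustrative.

def compose(f, g):
    """Name the composition idiom: return a new procedure f(g(x))."""
    return lambda x: f(g(x))

double = lambda x: 2 * x
increment = lambda x: x + 1

# Stored in a data structure like any other values...
pipeline = [increment, double]

# ...then passed as arguments to build a new procedure.
step = compose(*pipeline)  # increment(double(x))
print(step(10))  # -> 21
```

Once the pattern has a name, it can be reused and reasoned about directly, which is the "saving the idioms" the abstract emphasizes.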
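Finally, the abstract notes that Lisp programs can manipulate the representations of Lisp programs, enabling tools such as cross-referencers. A minimal analogue, using Python's standard-library `ast` module on an illustrative code fragment, shows the idea of a program reading another program's structure:

```python
# A toy cross-referencer: walk a program's representation and collect the
# names it references. The sample source text is illustrative.
import ast

source = "def area(r):\n    return pi * r * r\n"
tree = ast.parse(source)
names = sorted({node.id for node in ast.walk(tree)
                if isinstance(node, ast.Name)})
print(names)  # -> ['pi', 'r']
```

A real cross-referencer would also record where each name is bound and used, but the core move - treating the program as a data structure - is the same.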
National Digital Library of India (NDLI) is a virtual repository of learning resources which is not just a repository with search/browse facilities but provides a host of services for the learner community. It is sponsored and mentored by the Ministry of Education, Government of India, through its National Mission on Education through Information and Communication Technology (NMEICT). Filtered and federated searching is employed to facilitate focused searching, so that learners can find the right resource with the least effort and in minimum time. NDLI provides user group-specific services such as Examination Preparatory for school and college students and job aspirants. Services for researchers and general learners are also provided. NDLI is designed to hold content in any language and provides interface support for the 10 most widely used Indian languages. It is built to support all academic levels, including researchers and life-long learners, all disciplines, all popular forms of access devices, and differently-abled learners. It is designed to enable people to learn from best practices from all over the world and to facilitate inter-linked exploration from multiple sources for researchers. It is developed, operated and maintained by Indian Institute of Technology Kharagpur.
NDLI is a conglomeration of freely available, institutionally contributed, donated, or publisher-managed contents. Almost all of these contents are hosted at and accessed from their respective sources. The responsibility for the authenticity, relevance, completeness, accuracy, reliability and suitability of these contents rests with the respective organizations, and NDLI has no responsibility or liability for them. Every effort is made to keep the NDLI portal up and running smoothly, unless there are some unavoidable technical issues.
Ministry of Education, through its National Mission on Education through Information and Communication Technology (NMEICT), has sponsored and funded the National Digital Library of India (NDLI) project.
| Sl. | Authority | Responsibilities | Communication Details |
|---|---|---|---|
| 1 | Ministry of Education (GoI), Department of Higher Education | Sanctioning Authority | https://www.education.gov.in/ict-initiatives |
| 2 | Indian Institute of Technology Kharagpur | Host Institute of the Project: The host institute of the project is responsible for providing infrastructure support and hosting the project | https://www.iitkgp.ac.in |
| 3 | National Digital Library of India Office, Indian Institute of Technology Kharagpur | The administrative and infrastructural headquarters of the project | Dr. B. Sutradhar bsutra@ndl.gov.in |
| 4 | Project PI / Joint PI | Principal Investigator and Joint Principal Investigators of the project | Dr. B. Sutradhar bsutra@ndl.gov.in, Prof. Saswat Chakrabarti (will be added soon) |
| 5 | Website/Portal (Helpdesk) | Queries regarding NDLI and its services | support@ndl.gov.in |
| 6 | Contents and Copyright Issues | Queries related to content curation and copyright issues | content@ndl.gov.in |
| 7 | National Digital Library of India Club (NDLI Club) | Queries related to NDLI Club formation, support, user awareness program, seminar/symposium, collaboration, social media, promotion, and outreach | clubsupport@ndl.gov.in |
| 8 | Digital Preservation Centre (DPC) | Assistance with digitizing and archiving copyright-free printed books | dpc@ndl.gov.in |
| 9 | IDR Setup or Support | Queries related to establishment and support of Institutional Digital Repository (IDR) and IDR workshops | idr@ndl.gov.in |