The status of applications software: late




If you've ever set out to accomplish a particular task on your PC only to find there was no software that could do it, you've experienced software lag. It's a frustrating feeling—knowing that your computer is capable of doing what you need but is prevented from doing so by the lack of the right software. You've been cheated. The computer that once promised so much now has so little to offer.

The root of the problem is twofold. Neither IBM nor Microsoft has provided a 32-bit DOS-compatible operating system, and developers are still learning how to cope with many megabytes of data. As a result, the current crop of applications software often relies on brute force to get things done.

Not everything, however, is bad in the software world. In fact, there is evidence that applications software is headed for a common user interface, and that WYSIWYG may become a way of life. And programs may even be getting smarter.

Although you don't need a crystal ball to predict that changes in software are coming, exactly what those changes will be is less clear. But you can identify some of the forces driving them. The one thing that is certain is that users know what they want.

Of course, not all the fault for the software lag belongs to applications developers. They're missing an operating system designed specifically for 80386-based hardware. Although OS/2 happens to work on 80386 systems, it was not designed for them. It's a 16-bit operating system for 80286 machines.

On the other hand, developers have yet to conquer OS/2. Even the grandest application of them all—Lotus 1-2-3 release 3.0, which took years to produce—was designed for DOS 3.x. You'd be hard-pressed to walk into any computer store and find five OS/2 applications sitting on the shelf. A lot of software companies talk about OS/2 applications, but few have actually produced any.

The reasons offered are many, but it all boils down to a matter of investment. While OS/2's complexities, such as multitasking and data sharing, ultimately offer more headroom for sophisticated programs, its learning curve for developers is more like a brick wall.

Even the software giants such as Lotus, Ashton-Tate, and Microsoft, with their abundant resources, have experienced setbacks. Just consider the long waits for 1-2-3 release 3.0, dBASE IV, and a full-featured Windows word processor. And those are just DOS-based applications. The point is that, even for these companies with their millions of R&D dollars, the number of labor hours needed to develop sophisticated applications is gargantuan.

To complicate matters further, increased storage capacities have offered new opportunities and challenges for applications developers. While more storage would seem obviously better, not everyone is certain how best to use the hundreds of megabytes that optical drives provide.

For now, publishers are using CD-ROMs to provide static reference materials. Notable examples are Grolier's Electronic Encyclopedia and Microsoft's Programmer's Library. But what most users really need is for their applications to manage dynamic archiving.

Currently, when your hard disk becomes nearly full, you have to remove your older files. Maybe you archive them on floppy disks. If you do, chances are that you don't bother referring to those files again because it's too much trouble: you would have to fumble through all your archive disks, trying one and then another, to find a certain bit of information. You might even find it easier and faster to search through printed reports in a file cabinet.

That's one of the ironies of today's applications software. Although most of the modern world is convinced that you can do record keeping and manage things better on personal computers, you still have to resort to a file cabinet and Pendaflex folders to see your old records.

A better arrangement would be applications software that really takes advantage of read/write or WORM (write once, read many times) optical disks. Such software would, on a regular basis, archive your old records and files on optical disks. More important, the application program would manage those archives. It would continually update its indexes so that, say, five years from now, on a moment's notice, you could call up the spreadsheet for October 1989's production costs. If you needed to change optical disks, the program would tell you which one to insert.
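In today's terms, the heart of such an archive manager would be little more than an index that maps a record's description to the disk and file that hold it. The short Python sketch below is purely hypothetical; the disk labels, file names, and function are invented to illustrate the idea rather than taken from any real product.

    # Hypothetical archive index: maps (year, month, record type) to the
    # optical disk label and file that hold the archived record.
    ARCHIVE_INDEX = {
        (1989, 10, "production-costs"): ("ARCHIVE-07", "PROD1089.WKS"),
        (1989, 11, "production-costs"): ("ARCHIVE-07", "PROD1189.WKS"),
    }

    def locate_record(year, month, record_type):
        """Return the disk and file holding an archived record, or None."""
        entry = ARCHIVE_INDEX.get((year, month, record_type))
        if entry is not None:
            disk, filename = entry
            print(f"Insert optical disk '{disk}' to open {filename}.")
        return entry

    locate_record(1989, 10, "production-costs")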

Also, your application should be able to use that archived information. It should be able to correlate it with more recent information to generate comparative reports and to project the next year's performance.

Unfortunately, that kind of software does not exist today, even though the hardware to handle such tasks exists. The fact is, software for dealing with large amounts of on-line data is just emerging. Consider Lotus Magellan and Traveling Software's ViewLink, for example. They are the first major attempts to help you actively manage several megabytes of disparate information. Either will let you peer into data files on your hard disk and view the data in its native format. Both will also search your hard disk for the file or files containing specific information.

But while Magellan and ViewLink work fine as utilities for managing what's currently on your hard disk, they're really no help at managing archives on floppy disks. Both would also fall short in handling a gigabyte or more of data on optical disks. Even worse, both of these programs create a whole new set of problems. Magellan takes up valuable hard disk space with its index, and it needs to update the index frequently, sometimes taking several minutes to do so. And because ViewLink doesn't use an index, its searches can take a long time if you're working with a large disk with lots of data. Just as bad, there are no Magellan or ViewLink equivalents for Windows or Presentation Manager (PM).

 

Text 4.

Control at a distance

In a very real sense, the Internet has changed the way we think about information and the exchange of resources. Now engineers are using the Internet and software applications to remotely monitor and perform distributed execution of test and control applications. Such an approach reduces the time and cost involved in tests by sharing optoelectronics instrumentation and by distributing tasks to optimal locations.

A typical automated test and control system uses a computer to control positioning equipment and instrumentation. We'll use the term "remote control" to refer to the technique of enabling an outside computer to connect to an experiment and control that experiment from a distance. Such an approach benefits engineers who need to monitor applications running in harsh environments with limited access, or whose tests run too long for continuous human monitoring to be practical.

In addition, remote control offers engineers the ability to change test parameters at certain intervals without traveling to the site or even running from their office into another area of the building. This convenience allows a test operator to view results and make test modifications from home on the weekend, for example. The user simply logs on to the network from home, connects to the application, and makes those changes just as though he or she were on site.

To effectively control applications via the Internet, companies are developing software programs that champion remote execution. For instance, LabVIEW (National Instruments; Austin, TX) allows users to configure many software applications for remote control through a common Web browser simply by pointing the browser to a Web page associated with the application. Without any additional programming, the remote user can fully access the user interface that appears in the browser. The acquisition still occurs on the host computer, but the remote user has complete control of the process and can view acquired data in real time. Other users also can point their browser to the same URL to view the test.

Windows XP makes it easier to control applications via the Internet. With this Microsoft OS, users now get Remote Desktop and Remote Assistance, which offer tools for debugging deployed systems. After a system is deployed in the field, it is often cost-prohibitive for the support staff to visit every site. With Remote Desktop, a support operator can log in to a remote Windows XP machine and work as if he or she were sitting at the desk where that machine is located. With Remote Assistance, the onsite operator can remain in control of the desktop but the support operator can view the desktop on his or her remote machine. At any time, the onsite operator can give up control of the desktop to the support operator and still monitor which troubleshooting techniques are in use. Industry-standard software development tools take advantage of these new features.

At times, it may be desirable to use the Web browser to initiate a measurement or automation application but not actually control the experiment. In this case, the remote operator can log in, set certain parameters, and run the application over a common gateway interface (CGI). With CGI, the user communicates with a server-side program or script run by an HTTP server in response to an HTTP request from a Web browser. This program normally builds HTML dynamically by accessing other data sources such as a database. As part of the HTTP request, the browser can send to the server the parameters to use in running the application.
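As an illustration, the fragment below is a minimal server-side CGI sketch in Python, assuming an HTTP server already configured to execute it. The parameter names ("duration" and "samples") and the act of simply echoing them back as dynamic HTML are assumptions for illustration, and the standard cgi module it relies on exists in older Python 3 releases but has since been deprecated.

    #!/usr/bin/env python3
    # Minimal CGI sketch: read parameters sent by the browser and build HTML dynamically.
    import cgi

    form = cgi.FieldStorage()                     # parameters from the HTTP request
    duration = form.getfirst("duration", "60")    # assumed test duration in seconds
    samples = form.getfirst("samples", "100")     # assumed number of samples to acquire

    # A real script would start the measurement application with these parameters;
    # here we only acknowledge the request with dynamically built HTML.
    print("Content-Type: text/html")
    print()
    print("<html><body>")
    print(f"<p>Test started: {samples} samples over {duration} s.</p>")
    print("</body></html>")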

In classical remote control, one person or machine at a time is charged with controlling the experiment. In distributed execution, however, a user can truly take advantage of the benefits of networking, extending control to an entire remote system connected on the same network. In this way, individual machines focus on specific functions, and each system is optimized to perform its chosen task. Because data can be shared among the distributed components and each component accomplishes a unique task, this network functions as a complete system. For instance, it is possible to dedicate certain machines for acquisition and control while relegating analysis and presentation to other systems. Technology makes it possible to remotely monitor, control, and even run diagnostics while the system itself is dedicated to running acquisition and control, introducing the ability to multitask.

Certain test and control applications require an embedded, reliable solution. For these applications, the user can download the software to a headless, embedded controller to connect and control remotely. The controller can be a single unit or one of a series of form factors (such as a FieldPoint module, which can perform monitoring and control tasks in harsh environments). In either case, the software runs on a real-time operating system, but it can be accessed from a host computer using an Ethernet connection.

For example, consider a structural test system measuring the vibration and harmonics of a bridge design. It is possible to set up one node with a camera to monitor the testing of the bridge, then set up another node to measure parameters such as temperature, humidity, and wind direction and speed. Finally, one can set up a node to measure the load, strain, and displacement on certain areas of the bridge. The system can send all the data back to a main computer that correlates the data, analyzes it, and displays the results of the test on a Web page.

Each of these nodes would need to be running autonomously, acquiring data and sending it on to other computers that correlate the data and create reports. With the right software and hardware, each measurement node becomes an embedded, reliable, and durable solution. The user could easily control any of the measurement nodes to modify parameters of the test. In some systems, the test and its code are developed on a Windows operating system and then downloaded to the measurement node. This enables the user to make major modifications to the test and download them to the embedded target without visiting the site.
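A measurement node's side of that arrangement can be sketched in a few lines. The Python fragment below, which assumes an invented host name, port, and set of readings, simply pushes newline-delimited JSON samples to the central computer that performs the correlation.

    # Sketch of one autonomous measurement node streaming readings to the main computer.
    import json
    import socket
    import time

    CENTRAL_HOST = ("analysis-server.example.com", 9000)   # assumed address of the main computer

    def read_sensors():
        # Placeholder for real acquisition hardware (load, strain, displacement).
        return {"strain": 0.0012, "load_kN": 54.3, "displacement_mm": 1.8,
                "timestamp": time.time()}

    with socket.create_connection(CENTRAL_HOST) as conn:
        while True:
            sample = read_sensors()
            conn.sendall((json.dumps(sample) + "\n").encode("utf-8"))  # newline-delimited JSON
            time.sleep(1.0)                                            # acquisition interval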

Next, one of the live data-sharing techniques could be used to transfer the data to another cluster of computers that would correlate and analyze the data. Finally, an Internet server could allow project members to share the Web reports and analysis in geographically separated locations.

 

Text 5.

Data sharing

A key to accomplishing remote control and distributed execution is the data-sharing ability inherent to the Web. With new software programs, live data sharing can be as easy as right-clicking an item and placing a checkmark in a checkbox, which saves time for users and lets them take advantage of Web economies of scale, such as efficient data transfer from one computer to another and the ability to access data in real time. Applications must also afford users real-time access to acquired data to control or monitor a process or perform a test across a network.

Sharing data leads to convenience—users can be remote while control applications are running, and contact methods can extend to mobile phones or pagers. For example, certain software programs allow users to send e-mail alerts. Electronic notifications can be created that allow operators to receive alerts from the production area via mobile phones or pagers when certain process values exceed established limits; at that point, the operator can log on to control the application. Such updates, generated automatically during the testing process, free up operator time for more productive tasks. As an example, this technique would be useful for a small company running burn-in tests, which can take six to ten hours. With the type of system described above, the engineer could go back to his or her desk and receive an alert if test results don't fall within set test parameters.
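A sketch of such a notification, assuming a local SMTP relay and invented addresses and limits, might look like the Python fragment below; a pager or mobile phone could receive the same message through a carrier's e-mail-to-SMS gateway.

    # Sketch of an automatic e-mail alert when a monitored value leaves its limits.
    import smtplib
    from email.message import EmailMessage

    UPPER_LIMIT = 85.0            # e.g. burn-in chamber temperature in degrees C

    def send_alert(value):
        msg = EmailMessage()
        msg["From"] = "burnin-rig@example.com"             # assumed addresses
        msg["To"] = "operator@example.com"
        msg["Subject"] = "Burn-in test alert"
        msg.set_content(f"Measured value {value:.1f} exceeded the limit of {UPPER_LIMIT}.")
        with smtplib.SMTP("mail.example.com") as server:   # assumed local SMTP relay
            server.send_message(msg)

    reading = 87.2                # stand-in for a live measurement
    if reading > UPPER_LIMIT:
        send_alert(reading)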

With distributed execution tasks, the network enables users to access various measurement nodes. It is possible to develop software that uses each computer to complete a portion of the application; a test could have several acquisition nodes, each sharing data with the main computer or cluster of computers that perform the analysis, generate reports, and send them to the Web.

For data sharing, extensible markup language (XML, which enables definition, transmission, validation, and interpretation of data between applications and organizations) is quickly becoming a standard way to transfer data in a text-readable fashion that can easily be displayed on the Web. Because of the universal XML standard, one can generate a Web report featuring a defined data set and easily import it into other applications. Because the data is readily accessible, applications can download any XML document, parse the data, and perform custom analyses. Some software applications now include built-in functions for creating or reading XML documents.
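To make the exchange concrete, the sketch below uses Python's standard xml.etree.ElementTree module to write a small measurement report as XML and read it back; the element and attribute names are invented for illustration and do not follow any published schema.

    # Write a small measurement data set as XML, then parse it back.
    import xml.etree.ElementTree as ET

    report = ET.Element("report", test="bridge-vibration")
    for name, value in [("temperature_C", "21.4"), ("wind_speed_mps", "5.2")]:
        ET.SubElement(report, "measurement", name=name).text = value
    xml_text = ET.tostring(report, encoding="unicode")      # text-readable, Web-displayable

    # Any other application can parse the same text and run its own analysis.
    parsed = ET.fromstring(xml_text)
    for m in parsed.findall("measurement"):
        print(m.get("name"), "=", m.text)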

Manufacturers have realized the enormous cost benefits of using common off-the-shelf, Internet-related hardware and software components to communicate process data. The same technology used for Internet applications can also be used to connect the enterprise. On the plant floor, data acquisition and automation systems serve as information-access points to the larger corporate IT systems. Data can be transported using existing, widely accepted protocols to guarantee not only interconnectivity but also interoperability. The workforce is already trained to fetch and use data supplied through a browser.

NI’s DataSocket provides another method of sharing data directly with other parts of an organization. DataSocket implementation requires no extra development time—it streams the data in a graph or other user interface item over the network. Because DataSocket also is implemented as an ActiveX control, a Java Bean, and a component of Measurement Studio for C/C++ and Visual Basic development, users can incorporate the technology in many other applications. Project members who want to subscribe to the DataSocket Server item that contains the data use a URL to begin receiving data and any updates sent. With DataSocket, engineers can generate Web pages to display quality information from a manufacturing floor, changing properties of materials during an ongoing test, or even updates of the weather.

Although remotely controlling applications and distributing control via the Web has countless benefits related to operator convenience, as well as company time and cost savings, operators should also be cognizant of possible drawbacks. High amounts of traffic on the network could lead to slow updates or data transfer. The method of communication (Ethernet) is not a deterministic bus and offers no guarantee that data or execution will occur in a reliable amount of time.

Security is often a concern of Internet-related activities. If the remote system is on the same network as hundreds or millions of other users, the potential for system interference exists. Test and control applications should be implemented so that the network is protected by existing IT security systems. Best practices call for users to work with IT professionals to determine the best way to implement Web-based control applications without interfering with the particular IT system security.

In addition, many people could be trying to access the same application simultaneously. This requires companies to choose applications capable of handling multiple users accessing at the same time. If multiple access to an application is not possible, the users ultimately accomplish no more than they would through a single transaction.

The benefits of Web-based control far outweigh the disadvantages. Although certain hindrances may occur as a result of doing business on a network shared by millions, the advantages of convenience, cost, and time prompt software developers to investigate new ways to deal with the potential problems. For example, to avoid user confusion, software constraints can limit access so that only one client can control the application at a time, but that control can pass easily among the various clients at run-time. In addition, the host computer can take control of the application away from any of the remote clients at any time. The technique can also minimize cost by allowing service personnel to control and test remotely, for example.
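One way to picture the single-controller constraint is as a control token that a client must hold before it may change test parameters. The Python sketch below only illustrates that policy, with invented client names; it is not any particular product's mechanism.

    # Sketch of a control token: one client at a time, with explicit handoff.
    import threading

    class ControlToken:
        def __init__(self):
            self._lock = threading.Lock()
            self.holder = None

        def request(self, client_id):
            """Grant control only if no other client currently holds it."""
            if self._lock.acquire(blocking=False):
                self.holder = client_id
                return True
            return False

        def release(self, client_id):
            if self.holder == client_id:
                self.holder = None
                self._lock.release()

    token = ControlToken()
    print(token.request("remote-client-1"))   # True: first client takes control
    print(token.request("remote-client-2"))   # False: control is already held
    token.release("remote-client-1")
    print(token.request("remote-client-2"))   # True: control has passed to the second client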

The Internet is changing the way we control our applications by providing new ways to take measurements and distribute results. Many different options exist for remotely controlling applications and distributing execution. The best software programs allow users to take advantage of the power of the Web without having to become experts in any of its technologies, helping them incorporate the Internet into many different aspects of their application. This allows companies to integrate their applications easily into the existing corporate networking infrastructure so they can increase the productivity of those performing control.

 

Text 6.


