Wireless Intelligent Networks (WINs)
The intelligent network (IN) is an architectural concept that enables the real-time execution of network services and customer applications in a distributed environment of interconnected computers and switching systems. Operator services are thus automated in INs; these automated services provide directory assistance, supplying telephone numbers, addressing information, and so on to the customer. While there are various proprietary mobile intelligent network technologies, the standards-based technologies are often of most value to the mobile network operator and its customers. These standards-based technologies are referred to as the Wireless Intelligent Network (WIN). The Wireless Intelligent Network was developed to carry intelligent network capabilities, such as service independence, separation of basic switching functions from service and application functions, and independence of applications from lower-level communication details, into wireless networks. WINs are the primary means by which providers can deliver distinctive services with enhanced flexibility.
Wideband OFDMA
OFDM was invented in the 1960s, but only recently has it been recognized as an excellent method for bi-directional wireless data communication. It is extremely efficient at mitigating common problems in high-speed communication such as multipath fading and RF noise interference. Extended to a multiple-access technique, it is known as OFDMA.
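A minimal sketch of the OFDM principle follows, assuming NumPy, 64 subcarriers, QPSK data, and a 16-sample cyclic prefix (all illustrative values): the IFFT packs parallel subcarrier symbols into one time-domain symbol, and the cyclic prefix is what absorbs multipath delay spread. OFDMA then assigns disjoint subsets of these subcarriers to different users.

```python
# Minimal OFDM transmit/receive sketch (illustrative, not a full OFDMA system).
import numpy as np

N_SC, CP = 64, 16  # subcarriers and cyclic-prefix length (assumed values)

# Random QPSK data, one symbol per subcarrier
bits = np.random.randint(0, 2, (N_SC, 2))
symbols = (2 * bits[:, 0] - 1 + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

# Transmitter: the IFFT turns parallel subcarrier symbols into a time-domain
# signal; the cyclic prefix absorbs multipath delay spread.
time_signal = np.fft.ifft(symbols)
tx = np.concatenate([time_signal[-CP:], time_signal])

# Receiver: strip the prefix and FFT back to subcarrier symbols.
rx = np.fft.fft(tx[CP:])
assert np.allclose(rx, symbols)  # perfect recovery over an ideal channel
```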
Wearable Computing
The six devices to be introduced represent the new frontiers in the development of wearable technology. They are:
Nomad – Wearable Audio Computing
DyPERS – Dynamic Personal Enhanced Reality System
Wearable Cinema
Affective Computers
FAST – Wearable Computing for Factory Personnel
Computerized Clothing
WISENET
Cruise Control Devices
The concept of assisting the driver in the task of longitudinal vehicle control is known as cruise control. Starting from the cruise control devices of the seventies and eighties, the technology has now reached cooperative adaptive cruise control. This paper addresses the basic concept of adaptive cruise control and the requirements for realizing its improved versions, including stop-and-go adaptive cruise control and cooperative adaptive cruise control. Conventional cruise control was capable only of maintaining a set speed by accelerating or decelerating the vehicle. Adaptive cruise control devices assist the driver in keeping a safe distance from the preceding vehicle by controlling the engine throttle and brake according to sensor data about that vehicle. Most systems use RADAR as the sensor; a few use LIDAR. This paper includes a brief theory of the pulse Doppler radar and FM-CW LIDAR used as sensors, and the basic concept of the controller.
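As a rough illustration of the controller concept, here is a minimal constant-time-gap ACC control law in Python; the gains, headway, and acceleration limits are illustrative assumptions, not values from the paper.

```python
# Minimal constant-time-gap ACC control law sketch. The radar/LIDAR sensor is
# assumed to report the gap to the lead vehicle and the closing speed; the
# controller outputs an acceleration command for the throttle/brake actuators.

def acc_command(gap_m, own_speed_mps, closing_speed_mps,
                headway_s=1.5, standstill_m=5.0, k_gap=0.2, k_speed=0.6):
    """Return a longitudinal acceleration command in m/s^2 (gains assumed)."""
    desired_gap = standstill_m + headway_s * own_speed_mps  # time-gap policy
    gap_error = gap_m - desired_gap         # positive when we are too far back
    accel = k_gap * gap_error - k_speed * closing_speed_mps
    return max(-3.0, min(2.0, accel))       # comfort/safety limits (assumed)

# Example: 40 m gap at 25 m/s, closing at 2 m/s -> gentle braking
print(acc_command(40.0, 25.0, 2.0))
```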
CONTENTS
• PRINCIPLE OF ACC
• SENSOR OPTIONS
• SPACE OF MANEUVERABILITY AND STOPPING DISTANCE
• CONTROLLER
• COOPERATIVE ADAPTIVE CRUISE CONTROL (CACC)
• ADVANTAGES AND DISADVANTAGES
• CONCLUSION & REFERENCES
Download Report >>
Agent Oriented Programming
Agent-Oriented Software Engineering is one of the most recent contributions to the field of Software Engineering. It has several benefits compared to existing development approaches, in particular the ability to let agents represent high-level abstractions of active entities in a software system. This paper gives an overview of recent research and industrial applications, covering both general high-level methodologies and more specific design methodologies for industry-strength software engineering.
We know that people and systems depend on other people and systems to accomplish tasks or goals, that they make commitments to provide a task or meet a goal, and that they have strategies to ensure their goals are accomplished. Agent-oriented approaches model people and systems as agents.
Agent-oriented programming is an emerging programming paradigm with roots in the domain of artificial intelligence. This paradigm is often described as the natural successor to the object-oriented paradigm. Highly suited to applications embedded in complex dynamic environments, it is based on human concepts such as beliefs, goals, and plans. This allows a natural specification of sophisticated software systems in terms similar to human understanding, letting programmers concentrate on the critical properties of the application rather than getting absorbed in the intricacies of a complicated environment.
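As a toy illustration of the belief/goal/plan style described above, here is a minimal agent loop in Python; the thermostat scenario and all names are illustrative assumptions, not the API of any particular agent framework.

```python
# Toy BDI-flavored agent: beliefs, goals, and a plan selected by deliberation.

class Agent:
    def __init__(self):
        self.beliefs = {"temperature": 16}   # what the agent thinks is true
        self.goals = {"temperature": 21}     # what it wants to be true

    def plan_heat(self):
        # A plan: one concrete action that moves beliefs toward the goal.
        self.beliefs["temperature"] += 1

    def deliberate(self):
        # The agent loop: compare beliefs with goals, select and run a plan.
        while self.beliefs["temperature"] < self.goals["temperature"]:
            self.plan_heat()

agent = Agent()
agent.deliberate()
print(agent.beliefs)  # {'temperature': 21}
```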
Download Report >>
Boiler Instrumentation and Controls
Instrumentation and controls in a boiler plant encompass an enormous range of equipment, from the simple in a small industrial plant to the complex in a large utility station. Boiler instrumentation and controls (BIC) is the control system for industrial boilers. It consists of several control loops that regulate the various systems related to a boiler. The main boiler controls include combustion control and feedwater control. Different hardware methods are used to carry out the various control operations.
Virtually any boiler, old or new, industrial or utility, can benefit from one or more of the control system modifications available today, either by introducing advanced control schemes or by adding to existing control schemes.
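As a sketch of what one such control loop looks like, here is a minimal PID regulator for drum level acting on the feedwater valve; the gains, setpoint, and single-element structure are illustrative assumptions (real feedwater control is typically three-element, with steam-flow and feedwater-flow feedforward).

```python
# Minimal sketch of one boiler control loop: PID regulation of drum level
# via the feedwater valve. All numeric values are illustrative assumptions.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measured):
        # Standard PID: proportional + integral + derivative of the error.
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

level_loop = PID(kp=2.0, ki=0.1, kd=0.5, dt=1.0)
valve_command = level_loop.step(setpoint=0.5, measured=0.45)  # drum level, m
print(valve_command)
```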
Download Report >>
Symbian
Download Report >>
CDMA Wireless Data Transmitter
Machine health monitoring is of critical importance to safe and reliable operation. The emergence of wireless data transmission techniques has extended the scope of health monitoring to machine components and systems that are difficult to access or not suited for wired sensor data acquisition. This paper presents the design and prototype realization of a digital wireless data transmitter based on the CDMA technique.
The design provides a generic solution for various applications in electronic instrumentation, measurement, and embedded sensing for machine condition monitoring and health diagnosis.
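A minimal sketch of the direct-sequence CDMA idea behind such a transmitter follows, assuming NumPy, random length-64 spreading codes, and one data bit per sensor (all illustrative choices): each transmitter spreads its data with its own pseudo-noise code, and the receiver despreads by correlating with that same code.

```python
# Direct-sequence CDMA sketch: two sensors share the channel via PN codes.
import numpy as np

rng = np.random.default_rng(0)
CHIPS = 64  # spreading factor (assumed)

pn_a = rng.choice([-1, 1], CHIPS)   # PN code of sensor A
pn_b = rng.choice([-1, 1], CHIPS)   # PN code of sensor B

bit_a, bit_b = 1, -1                # one data bit from each sensor
channel = bit_a * pn_a + bit_b * pn_b   # spread signals overlap on the air

# Despreading: correlate with each code; random codes are quasi-orthogonal,
# so the correct bit's contribution dominates with high probability.
recovered_a = np.sign(channel @ pn_a)
recovered_b = np.sign(channel @ pn_b)
print(recovered_a, recovered_b)     # 1.0 -1.0
```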
Download Report >>
Dense Wavelength Division Multiplexing
Download Report >>
Crusoe Processor
Mobile computing has been the buzzword for quite a long time. Mobile computing devices like laptops, webslates, and notebook PCs are becoming common nowadays. The heart of every PC, whether a desktop or a mobile PC, is the microprocessor. Several microprocessors are available in the market for desktop PCs from companies like Intel, AMD, and Cyrix. The mobile computing market, however, has never had a microprocessor specifically designed for it; the microprocessors used in mobile PCs are optimized versions of desktop PC microprocessors. Mobile computing makes very different demands on processors than desktop computing, yet up until now, mobile x86 platforms have simply made do with the same old processors originally designed for desktops. Those processors consume lots of power, and they get very hot. When you're on the go, a power-hungry processor means you have to pay a price: run out of power before you've finished, run more slowly and lose application performance, or run through the airport with pounds of extra batteries.
A hot processor also needs fans to cool it, making the resulting mobile computer bigger, clunkier, and noisier. A newly designed microprocessor with low power consumption will still be rejected by the market if its performance is poor, so any attempt in this regard must strike a proper performance-power balance to ensure commercial success. A new microprocessor must also be fully x86-compatible, that is, it should run x86 applications just like conventional x86 microprocessors, since most presently available software has been designed to work on the x86 platform.
Download Report >>
Delay Tolerant Networking
Download Report >>
Cellular Digital Packet Data (CDPD)
Cellular Digital Packet Data (CDPD) is a specification for supporting wireless access to the Internet and other public data networks. CDPD transmits digital packet data at 19.2 kbps, using idle times between cellular voice calls on the cellular telephone network.
CDPD technology represents a way for law enforcement agencies to improve how they manage their communications and information systems; data transmitted on CDPD systems travels several times faster than data sent over analog networks.
CDPD is an overlay to the existing cellular network that enables users to transmit packets of data over the cellular network using a portable computing device and a CDPD modem. CDPD offers a high-speed, high-capacity, low-cost system with the greatest possible coverage. Additionally, data is encrypted for security. CDPD air link transmissions have a raw data rate of 19,200 bps. As a tool for transmitting data, CDPD utilizes digital networks.
The Raven is a rugged, full-duplex Cellular Digital Packet Data (CDPD) modem that provides wireless transport capabilities for fixed and mobile applications. The Raven offers an efficient and secure wireless packet data technology that is ideal for untethered applications.
Download Report >>
Biometric Technology
BIOMETRICS refers to the automatic identification of a person based on his or her physiological or behavioral characteristics, such as a fingerprint or iris pattern, or aspects of behavior like handwriting or keystroke patterns. Biometrics is applied both to identity verification and to recognition, and the problem each involves is somewhat different. Verification requires the person being identified to lay claim to an identity, so the system has two choices: accept or reject the person's claim. Recognition requires the system to look through many stored sets of characteristics and pick the one that matches the unknown individual being presented. A biometric system is essentially a pattern recognition system that makes a personal identification by determining the authenticity of a specific physiological or behavioral characteristic possessed by the user.
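A toy sketch of the two modes in Python follows, using made-up two-dimensional feature vectors, a Euclidean distance metric, and an arbitrary threshold (all illustrative assumptions; real systems extract features from fingerprints, irises, and so on).

```python
# Verification (1:1) vs. recognition (1:N), sketched on toy feature vectors.
import numpy as np

def verify(claimed_template, sample, threshold=0.5):
    """1:1 match: accept or reject a claimed identity."""
    return np.linalg.norm(claimed_template - sample) < threshold

def recognize(database, sample):
    """1:N search: return the enrolled identity closest to the sample."""
    return min(database, key=lambda name: np.linalg.norm(database[name] - sample))

db = {"alice": np.array([0.1, 0.9]), "bob": np.array([0.8, 0.2])}
probe = np.array([0.15, 0.85])
print(verify(db["alice"], probe))   # True: the claim is accepted
print(recognize(db, probe))         # 'alice'
```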
Biometrics is a rapidly evolving technology that is being used in forensics, for applications such as criminal identification and prison security, and has the potential to be used in a large range of civilian application areas. Biometrics can be used for transactions conducted via telephone and the Internet (electronic commerce and electronic banking). In automobiles, biometrics can replace keys with key-less entry devices.
Download Report >>
Embedded Linux
Although the term "Embedded Linux" is only a couple of years old, Embedded Linux has already established itself as one of the most important technologies to enter the embedded computing market. The power, reliability, flexibility, and scalability of Linux, combined with its support for a multitude of microprocessor architectures, hardware devices, graphics subsystems, and communications protocols, have established Linux as an increasingly popular software platform for a vast array of projects and products. Use of Linux spans the full spectrum of computing applications, from IBM's tiny Linux wrist watch to hand-held devices and consumer entertainment systems, to Internet appliances, thin clients, firewalls, equipment, . . . and even to cluster-based supercomputers.
If you could travel back in time to the Embedded Systems Conference of September 1999, one short year ago, you would find that the "Embedded Linux market" simply did not exist. Sure, a growing number of developers and a handful of companies were starting to embed Linux, but as a market that anyone tracked or paid attention to, Embedded Linux simply hadn't made it onto the radar screens. One year ago, embedding Linux was a relatively rare phenomenon, mostly the result of developer innovation rather than the fruit of marketing plans and promotional strategies.
"Embedded Linux" has now become a disruptive force in the market.The open availability of source, coupled with today's unheralded ease and speed of collaboration and communication, turned out to be compelling factors that enabled developers to quickly and efficiently adapt to the challenges of rapidly changing landscape. So Linux began to spread like wildfire in the embedded market.
Download Report >>
Extreme programming
Download Report >>
Extreme Ultraviolet Lithography
Silicon has been the heart of the world's technology boom for nearly half a century. Each year, manufacturers bring out the next great computer chip that boosts computing power and allows our personal computers to do more than we imagined just a decade ago. The current technology used to make microprocessors, deep-ultraviolet lithography, will begin to reach its limit around 2005. At that time, chipmakers will have to look to other technologies to cram more transistors onto silicon to create more powerful chips. Many are already looking at extreme-ultraviolet lithography (EUVL) as a way to extend the life of silicon at least until the end of the decade.
Akin to photography, lithography is used to print circuits onto microchips. Extreme Ultraviolet Lithography (EUVL) will open a new chapter in semiconductor technology. In the race to provide the Next Generation Lithography (NGL) for faster, more efficient computer chips, EUV lithography is the clear frontrunner. Here we discuss the basic concepts and current state of development of EUV lithography, a relatively new form of lithography that uses extreme ultraviolet (EUV) radiation with a wavelength in the range of 10 to 14 nanometers (nm) to carry out projection imaging. EUVL is one technology vying to become the successor to optical lithography.
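A back-of-the-envelope way to see why the shorter wavelength matters is the Rayleigh resolution criterion, R = k1 * lambda / NA; the k1 and NA values in the sketch below are illustrative assumptions, not figures from the text.

```python
# Rayleigh-criterion estimate of printable feature size: R = k1 * lambda / NA.
K1 = 0.5    # process-dependent factor (assumed)
NA = 0.25   # numerical aperture of an early EUV optic (assumed)

for wavelength_nm in (193, 13.5):   # deep-UV vs. a typical EUV wavelength
    print(wavelength_nm, "nm ->", K1 * wavelength_nm / NA, "nm features")
# 193 nm -> 386 nm features; 13.5 nm -> 27 nm features, with the same k1 and NA
```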
Download Report >>
Optical Computing Technology
DSP processors
The architecture of DSP processors has become a major topic ever since the development of digital signal processing. Digital signal processing is finding wider application in almost all fields, and so are the efforts to develop faster digital signal processors; a fast DSP processor requires a more sophisticated architecture. Two main companies in this field are Texas Instruments and Analog Devices. The basic architecture of DSP processors from these companies is discussed here.
The major components are listed below:
1. Fast and flexible arithmetic units
2. Extended dynamic registers
3. Single-cycle fetch of two operands
4. Hardware circular buffers
5. Zero-overhead looping and branching
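As a software illustration of items 4 and 5, here is what a hardware circular buffer does for an FIR filter, sketched in Python with assumed tap values; on a real DSP the index wrap and the per-tap multiply-accumulate incur no loop overhead.

```python
# Software sketch of a DSP-style circular buffer driving an FIR filter:
# the newest sample overwrites the oldest, and the read index wraps with a
# modulo instead of shifting the whole delay line.

TAPS = [0.25, 0.5, 0.25]          # simple smoothing filter (assumed values)
buf = [0.0] * len(TAPS)           # circular delay line
newest = 0                        # index of the most recent sample

def fir_step(sample):
    global newest
    buf[newest] = sample
    acc = 0.0
    for k, coeff in enumerate(TAPS):
        acc += coeff * buf[(newest - k) % len(TAPS)]  # wrap-around addressing
    newest = (newest + 1) % len(TAPS)
    return acc

print([round(fir_step(x), 3) for x in [1.0, 1.0, 1.0, 0.0]])
# [0.25, 0.75, 1.0, 0.75]: the step response of the smoothing filter
```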
Digital radio broadcasting
Synthetic Aperture Radar
Quantum Dot Lasers
Grid computing
The last decade has seen a substantial increase in commodity computer and network performance, mainly as a result of faster hardware and more sophisticated software. Nevertheless, there are still problems in the fields of science, engineering, and business that cannot be effectively dealt with using the current generation of supercomputers. In fact, due to their size and complexity, these problems are often numerically and/or data intensive and consequently require a variety of heterogeneous resources that are not available on a single machine. A number of teams have conducted experimental studies on the cooperative use of geographically distributed resources unified to act as a single powerful computer. This new approach is known by several names, such as metacomputing, scalable computing, global computing, Internet computing, and, more recently, Grid computing.
The early efforts in Grid computing started as a project to link supercomputing sites, but have now grown far beyond their original intent. In fact, many applications can benefit from the Grid infrastructure, including collaborative engineering, data exploration, high-throughput computing, and of course distributed supercomputing. Moreover, due to the rapid growth of the Internet and the Web, there has been a rising interest in Web-based distributed computing, and many projects have been started that aim to exploit the Web as an infrastructure for running coarse-grained distributed and parallel applications. In this context, the Web has the capability to be a platform for parallel and collaborative work as well as a key technology for creating a pervasive and ubiquitous Grid-based infrastructure.
This paper aims to present the state of the art of Grid computing and attempts to survey the major international efforts in developing this emerging technology.
Cluster Computing
A cluster is a type of parallel or distributed processing system consisting of a collection of interconnected stand-alone computers cooperatively working together as a single, integrated computing resource.
This cluster of computers shares common network characteristics, such as the same namespace, and is available to other computers on the network as a single resource. The computers are linked together using high-speed network interfaces, and the actual binding together of all the individual computers in the cluster is performed by the operating system and the clustering software.
Download Report >>
Clockless Chips
Clock speeds are now in the gigahertz range, and there is not much room for speedup before physical realities start to complicate things. With a gigahertz clock powering a chip, signals barely have enough time to make it across the chip before the next clock tick. At this point, speeding up the clock frequency could become disastrous. This is where a chip that is not constricted by a clock comes into action.
The clockless approach, which uses a technique known as asynchronous logic, differs from conventional computer circuit design in that the switching on and off of digital circuits is controlled individually by specific pieces of data rather than by a tyrannical clock that forces all of the millions of circuits on a chip to march in unison.
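As a conceptual (not gate-level) illustration, the following Python sketch models the request/acknowledge handshake that paces data transfers in many asynchronous designs, using a one-slot queue in place of the req/ack wires: each transfer completes when the receiver accepts it, with no clock edge involved.

```python
# Toy model of handshake-paced data transfer between two asynchronous stages.
import queue, threading

channel = queue.Queue(maxsize=1)   # the req/ack pair modeled as a 1-slot queue

def sender(data):
    for item in data:
        channel.put(item)          # "request": blocks until the receiver is ready

def receiver(n, out):
    for _ in range(n):
        out.append(channel.get())  # "acknowledge": completes the handshake

data, out = [1, 2, 3], []
t = threading.Thread(target=receiver, args=(len(data), out))
t.start(); sender(data); t.join()
print(out)                         # [1, 2, 3], transferred without any clock
```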
A major hindrance to the development of clockless chips is the competitiveness of the computer industry. Presently, it is nearly impossible for companies to develop and manufacture a clockless chip while keeping the cost reasonable. Another problem is that there are few tools for developing asynchronous chips. Until these problems are solved, clockless chips will not be a major player in the market.
In this seminar the topics covered are the general concept of asynchronous circuits, their design issues, and types of design. The major designs discussed are the bounded-delay method, the delay-insensitive method, and Null Convention Logic (NCL).
The seminar also compares synchronous and asynchronous circuits and surveys the applications in which asynchronous circuits are used.
Download Report >>
Cellular Neural Networks
Download Report >>
Differentiated Services
An increasing demand for Quality of Service on the Internet has led to various developments in that area. Differentiated Services is a technique to provide such Quality of Service in an efficient and scalable way.
Management of computer networks involves both monitoring of running services as well as the configuration of those services. On the Internet, the SNMP protocol is used to retrieve and set variables in a MIB. In order to facilitate the management of routers equipped with Differentiated Services, the IETF has created the DiffServ MIB (which is still work in progress).
This assignment involves building a prototype implementation of the DiffServ MIB on a router running the GNU/Linux operating system, using the Network Traffic Control facilities in the kernel and the net-snmp SNMP agent software.
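As one small piece of the data path such an implementation manages, here is a minimal token-bucket meter of the kind the DiffServ architecture (and hence the DiffServ MIB) models; the rate and burst values are illustrative assumptions, not values from the MIB or the assignment.

```python
# Minimal token-bucket meter: packets within the configured rate/burst profile
# conform (e.g. keep their marking); excess packets can be remarked or dropped.

class TokenBucket:
    def __init__(self, rate_bps, burst_bytes):
        self.rate = rate_bps / 8.0        # refill rate in bytes per second
        self.capacity = burst_bytes
        self.tokens = float(burst_bytes)
        self.last = 0.0

    def conforms(self, packet_bytes, now_s):
        # Refill tokens for the elapsed time, capped at the burst size.
        self.tokens = min(self.capacity,
                          self.tokens + (now_s - self.last) * self.rate)
        self.last = now_s
        if packet_bytes <= self.tokens:
            self.tokens -= packet_bytes
            return True                   # in-profile
        return False                      # out-of-profile

meter = TokenBucket(rate_bps=1_000_000, burst_bytes=1500)
print(meter.conforms(1500, 0.0), meter.conforms(1500, 0.001))  # True False
```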
The IETF diffserv WG is still working on the DiffServ MIB. The results of the implementation work are valuable to the MIB authors, as they may help in improving the MIB specification. Therefore, any results should be reported back to the IETF.
Download Report >>
Compact Peripheral Component Interconnect
Compact peripheral component interconnect (CPCI) is an adaptation of the peripheral component interconnect (PCI) specification for industrial computer applications requiring a smaller, more robust mechanical form factor than the one defined for the desktop. CompactPCI is an open standard supported by the PCI Industrial Computer Manufacturers Group (PICMG). CompactPCI is best suited for small, high-speed industrial computing applications where transfers occur between a number of high-speed cards.
CompactPCI is a high-performance industrial bus that uses the Eurocard form factor and is fully compatible with the Enterprise Computer Telephony Forum (ECTF) computer telephony (CT) Bus™ H.110 standard specification. CompactPCI products make it possible for original equipment manufacturers (OEM), integrators, and resellers to build powerful and cost-effective solutions for telco networks, while using fewer development resources.
Download Report >>
Autonomic Computing
“Autonomic Computing” is a new vision of computing initiated by IBM. This new paradigm shifts the fundamental definition of the technology age from one of computing to one defined by data. Access to data from multiple, distributed sources, in addition to traditional centralized storage devices, will allow users to transparently access information when and where they need it. At the same time, this new view of computing will necessitate changing the industry's focus from processing speed and storage to the development of distributed networks that are largely self-managing, self-diagnostic, and transparent to the user.
The high-tech industry has spent decades creating computer systems with ever-mounting degrees of complexity to solve a wide variety of business problems. Ironically, complexity itself has become part of the problem. It's a problem that's not going away, but will grow exponentially, just as our dependence on technology has.
Download Report >>
Augmented reality
Video games have been entertaining us for nearly 30 years, ever since Pong was introduced to arcades in the early 1970s. Computer graphics have become much more sophisticated since then, and soon game graphics will seem all too real. In the next decade, researchers plan to pull graphics out of your television screen or computer display and integrate them into real-world environments. This new technology, called augmented reality, will further blur the line between what is real and what is computer-generated by enhancing what we see, hear, feel, and smell.
Augmented reality will truly change the way we view the world. Picture yourself walking or driving down the street. With augmented-reality displays, which will eventually look much like a normal pair of glasses, informative graphics will appear in your field of view, and audio will coincide with whatever you see. These enhancements will be refreshed continually to reflect the movements of your head.
Augmented reality is still in the early stages of research and development at various universities and high-tech companies. Eventually, possibly by the end of this decade, we will see the first mass-marketed augmented-reality system, which can be described as "the Walkman of the 21st Century".
Aeronautical Communications
BiCMOS Technology
The need for high-performance, low-power, and low-cost systems for network transport and wireless communications is driving silicon technology toward higher speed, higher integration, and more functionality. Furthermore, this integration of RF and analog mixed-signal circuits into high-performance digital signal-processing (DSP) systems must be done with minimum cost overhead to be commercially viable. While some analog and RF designs have been attempted in mainstream digital-only complementary metal-oxide semiconductor (CMOS) technologies, almost all designs with stringent RF performance requirements use bipolar or BiCMOS technology. Silicon integrated circuit (IC) products that at present require modern bipolar or BiCMOS silicon technology in the wired application space include synchronous optical network (SONET) and synchronous digital hierarchy (SDH) devices operating at 10 Gb/s and higher.
The viability of a mixed digital/analog/RF chip depends on the cost of making the silicon with the required elements; in practice, it must approximate the cost of a CMOS wafer. Cycle times for processing the wafer should not significantly exceed those for a digital CMOS wafer, and yields of the SOC chip must be similar to those of a multi-chip implementation. Much of this article examines process techniques that achieve the objectives of low cost, rapid cycle time, and solid yield.
Download Report >>