Wireless Intelligent Networks (WINs)

 Wireless telecommunications can be divided into two broad categories: mobile communications and fixed wireless communications. The mobile communications market requires mobility, or non-tethered communications. The goal of mobility is anytime, anywhere communications. Mobile communications technology must be able to allow roaming: the ability to provide service to mobile phone users while they are outside their home system. Fixed wireless, on the other hand, is simply an alternative to wired communications.

 The intelligent network (IN) is an architectural concept that enables the real-time execution of network services and customer applications in a distributed environment consisting of interconnected computers and switching systems. Operator services are thus automated in INs; these automated services provide directory assistance, supplying telephone numbers, addressing information, and so on to the customer. While there are various proprietary mobile intelligent network technologies, the standards-based technologies are often of most value to the mobile network operator and their customers. These standards-based technologies are referred to as the Wireless Intelligent Network (WIN). The Wireless Intelligent Network was developed to drive intelligent-network capabilities, such as service independence, separation of basic switching functions from service and application functions, and independence of applications from lower-level communication details, into wireless networks. WINs are the primary means by which providers can deliver distinctive services with enhanced flexibility.


Download >>

Wideband OFDMA

    Orthogonal Frequency Division Multiplexing (OFDM) is a multicarrier transmission technique. OFDM is a communication technique that divides the communication channel into a number of equally spaced frequency bands. A subcarrier in each band carries a portion of the user's information. Each subcarrier is orthogonal to (independent of) every other subcarrier. OFDM efficiently squeezes multiple modulated carriers tightly together, reducing the required bandwidth.

    OFDM was invented in the 1960s, but only recently has it been recognized as an excellent method for bi-directional wireless data communication. It is extremely efficient in mitigating common problems in high-speed communication such as multipath fading and RF noise interference. When its subcarriers are shared among multiple users, it can be used as a multiple-access technique, OFDMA.
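
To see the orthogonality at work, the short Python sketch below (an illustrative toy, not from the report) builds one OFDM symbol with an IFFT and shows that the receiver can separate the subcarriers again with an FFT, and that any two distinct subcarriers are mutually orthogonal.

```python
# Toy OFDM modulation/demodulation via the IFFT/FFT pair.
import numpy as np

n_sub = 8                                 # number of subcarriers
symbols = np.random.choice([1+1j, 1-1j, -1+1j, -1-1j], n_sub)  # QPSK symbols

# One OFDM symbol: the IFFT places each data symbol on its own subcarrier
time_signal = np.fft.ifft(symbols)

# Orthogonality lets the receiver separate the subcarriers with an FFT
recovered = np.fft.fft(time_signal)
print(np.allclose(recovered, symbols))    # True

# Orthogonality check: two distinct subcarriers correlate to zero
c0 = np.exp(2j * np.pi * 0 * np.arange(n_sub) / n_sub)
c1 = np.exp(2j * np.pi * 1 * np.arange(n_sub) / n_sub)
print(abs(np.vdot(c0, c1)) < 1e-9)        # True
```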


Download >>

Wearable Computing


            The concept of wearable computing was first brought forward by Steve Mann, who, with his invention of the ‘WearComp’ in 1979, made a pioneering effort in wearable computing. Beyond being a portable computer, a wearable computer must be an adaptive system with an independent processor. That is, the system must adapt to the whims and fancies of the user instead of the user having to adapt his lifestyle to the system.

The six devices to be introduced represent the new frontiers in the development of wearable technology. They are:

Nomad – Wearable Audio Computing
DyPERS – Dynamic Personal Enhanced Reality System
Wearable Cinema
Affective Computers
FAST – Wearable Computing for Factory Personnel
Computerized Clothing


Download >>

WISENET


                 WISENET is a wireless sensor network that monitors environmental conditions such as light, temperature, and humidity. The network is composed of nodes called “motes” that form an ad-hoc network to transmit this data to a computer that functions as a server. The server stores the data in a database, where it can later be retrieved and analyzed via a web-based interface. The network has been demonstrated successfully with an implementation of one sensor mote.
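
A minimal sketch of the server-side data path described above, with mote readings stored in a database and queried as the web interface would; the table layout and function names are assumptions for illustration only.

```python
# Sketch of the WISENET-style store-and-query path (illustrative schema).
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE readings
              (mote_id TEXT, ts REAL, light REAL, temp_c REAL, humidity REAL)""")

def store_reading(mote_id, ts, light, temp_c, humidity):
    # in the real system this would be triggered by a packet arriving
    # from the ad-hoc mote network at the server
    db.execute("INSERT INTO readings VALUES (?, ?, ?, ?, ?)",
               (mote_id, ts, light, temp_c, humidity))

store_reading("mote-1", 1000.0, 420.0, 22.5, 0.48)
store_reading("mote-1", 1060.0, 415.0, 22.7, 0.47)

# the kind of query a web-based analysis interface might issue
for row in db.execute("SELECT ts, temp_c FROM readings WHERE mote_id = ?",
                      ("mote-1",)):
    print(row)
```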

Download >>

Cruise Control Devices

The concept of assisting the driver in the task of longitudinal vehicle control is known as cruise control. Starting from the cruise control devices of the seventies and eighties, the technology has now reached cooperative adaptive cruise control. This paper addresses the basic concept of adaptive cruise control and the requirements for realizing its improved versions, including stop-and-go adaptive cruise control and cooperative adaptive cruise control. Conventional cruise control was capable only of maintaining a set speed by accelerating or decelerating the vehicle. Adaptive cruise control devices are capable of assisting the driver to keep a safe distance from the preceding vehicle by controlling the engine throttle and brake according to sensor data about that vehicle. Most of the systems use radar as the sensor; a few use lidar as well. This paper includes a brief theory of the pulse Doppler radar and FM-CW lidar used as sensors, and the basic concept of the controller.
CONTENTS
•  PRINCIPLE OF ACC
•  SENSOR OPTIONS
•  SPACE OF MANEUVERABILITY AND STOPPING DISTANCE
•  CONTROLLER
•  COOPERATIVE ADAPTIVE CRUISE CONTROL [CACC]
•  ADVANTAGES AND DISADVANTAGES
•  CONCLUSION & REFERENCES 
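
As a minimal illustration of the adaptive cruise control concept described above, the Python sketch below implements a constant-time-gap spacing policy; the gains, limits, and function names are illustrative assumptions, not taken from the paper.

```python
# Toy ACC loop: plain cruise control when no target is detected,
# distance regulation when a preceding vehicle is seen by the sensor.

def acc_command(own_speed, gap, lead_speed,
                set_speed=30.0, time_gap=1.5, kp=0.3, kv=0.5):
    """Return a desired acceleration in m/s^2.

    Falls back to conventional cruise control when no vehicle is
    detected (gap is None); otherwise keeps a constant-time-gap distance.
    """
    if gap is None:                      # no target: hold the set speed
        return kv * (set_speed - own_speed)
    desired_gap = 5.0 + time_gap * own_speed   # 5 m standstill gap + time gap
    accel = kp * (gap - desired_gap) + kv * (lead_speed - own_speed)
    # never brake harder than -3 m/s^2 or accelerate past the set speed
    return max(-3.0, min(accel, min(2.0, kv * (set_speed - own_speed))))

# Example: 20 m behind a slower lead vehicle at 25 m/s while doing 28 m/s
print(acc_command(own_speed=28.0, gap=20.0, lead_speed=25.0))  # -3.0 (brake)
```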


Download >>

Agent Oriented Programming

Agent-Oriented Software Engineering is one of the most recent contributions to the field of Software Engineering. It has several benefits compared to existing development approaches, in particular the ability to let agents represent high-level abstractions of active entities in a software system. This paper gives an overview of recent research and industrial applications, covering both general high-level methodologies and more specific design methodologies for industry-strength software engineering.

We know that people/systems depend on other people/systems to accomplish tasks or goals, that people/systems make commitments to perform a task or meet a goal, and that people/systems have strategies to ensure their goals are accomplished. Agent-oriented approaches model people and systems as agents.

Agent-oriented programming is an emerging programming paradigm with roots in the domain of artificial intelligence. This paradigm is often described as the natural successor to the object-oriented paradigm. Highly suited to applications that are embedded in complex dynamic environments, it is based on human concepts such as beliefs, goals, and plans. This allows a natural specification of sophisticated software systems in terms that are similar to human understanding, allowing programmers to concentrate on the critical properties of the application rather than getting absorbed in the intricacies of a complicated environment.
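
To make the beliefs/goals/plans vocabulary concrete, here is a minimal Python sketch of an agent in that style; the class, goal names, and plan structure are illustrative assumptions, not the API of any real agent framework.

```python
# Toy agent in the belief/goal/plan (BDI-like) style.

class Agent:
    def __init__(self):
        self.beliefs = {"door_open": False}
        self.goals = ["enter_room"]
        # plans map a goal to the steps believed to achieve it
        self.plans = {"enter_room": [self.open_door, self.walk_in]}

    def open_door(self):
        self.beliefs["door_open"] = True
        print("opened the door")

    def walk_in(self):
        if self.beliefs["door_open"]:
            print("walked into the room")

    def step(self):
        # deliberate: pick each goal, then execute its plan step by step
        for goal in list(self.goals):
            for action in self.plans.get(goal, []):
                action()
            self.goals.remove(goal)

Agent().step()
```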

Download Report >>

Boiler Instrumentation and Controls

Instrumentation and controls in a boiler plant encompass an enormous range of equipment, from the simple systems of a small industrial plant to the complex ones of a large utility station. Boiler instrumentation and control (BIC) is the control of industrial boilers. It consists of several control loops to control the various systems related to a boiler. The main controls of a boiler include combustion control and feedwater control. Different hardware methods are used to carry out the various control operations.

Virtually any boiler, old or new, industrial or utility, can benefit from one or several of the control system modifications available today, either by introducing advanced control schemes or by adding to existing control schemes.

Download Report >>

Symbian

Mobile phones have become part and parcel of our life. Today's mobile phones offer not only traditional voice services but also other applications like SMS, MMS, Internet access, video players, and radio. All these advancements are made possible by using the Symbian OS in mobiles. Just as PCs have an operating system like Windows, Symbian is the OS for mobile phones. But unlike PC designs, mobile phones put constraints on a suitable OS. The operating system has to have a low memory footprint, low dynamic memory usage, an efficient power-management framework, and real-time support for communication and telephony protocols. Symbian OS is designed for the mobile phone environment. It addresses the constraints of mobile phones by providing a framework to handle low-memory situations, power management, and a rich software layer implementing industry standards for communication, telephony, and data rendering. Symbian OS development was the result of extensive research carried out by the world's leading mobile phone companies like Nokia, Motorola, and Sony Ericsson. Today two versions, Symbian OS v6 and v7, have evolved and are extensively used. With improved picture resolution and flexible user interfaces, today's mobile phones are capable of replacing PDAs and even palmtops.

Download Report >>

CDMA Wireless Data Transmitter

Machine health monitoring is of critical importance to safe and reliable machine operation. The emergence of wireless data transmission techniques has extended the scope of health monitoring to machine components and systems that are difficult to access or not suited for wired sensor data acquisition. This paper presents the design and prototype realization of a digital wireless data transmitter based on the CDMA technique.

The design provides a generic solution for various applications in electronic instrumentation, measurement, and embedded sensing for machine condition monitoring and health diagnosis.
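
The core of the CDMA technique is direct-sequence spreading: each data bit is multiplied by a fast pseudo-noise chip sequence, and the receiver recovers the bit by correlating with the same sequence. A minimal sketch, assuming a toy chip sequence and an ideal channel:

```python
# Toy direct-sequence spreading/despreading, the mechanism behind CDMA.
import numpy as np

pn = np.array([1, -1, 1, 1, -1, 1, -1, -1])   # toy pseudo-noise chip sequence
bits = np.array([1, -1, 1])                   # data bits as +/-1

# Spread: each bit is multiplied by the whole chip sequence
tx = np.concatenate([b * pn for b in bits])

# Despread: correlate each chip-length block with the same sequence
rx = tx  # ideal channel for illustration
recovered = [int(np.sign(np.dot(rx[i:i + len(pn)], pn)))
             for i in range(0, len(rx), len(pn))]
print(recovered)  # [1, -1, 1]
```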

Download Report >>

Dense Wavelength Division Multiplexing

The technology of combining a number of optical wavelengths and then transmitting them through a single fibre is called wavelength division multiplexing (WDM). Conceptually, the technology is similar to the frequency division multiplexing (FDM) used in analogue transmission. Dense wavelength division multiplexing (DWDM) is a newly born multiplexing technology in fibre-optic transmission, bringing about a revolution in the bit-rate carrying capacity of a single fibre. The emergence of DWDM systems is one of the most important developments in fibre-optic transmission. This article gives an introduction to DWDM technology.

Download Report >>

Crusoe Processor

Mobile computing has been the buzzword for quite a long time. Mobile computing devices like laptops, webslates, and notebook PCs are becoming common nowadays. The heart of every PC, whether a desktop or mobile PC, is the microprocessor. Several microprocessors are available in the market for desktop PCs from companies like Intel, AMD, Cyrix, etc. The mobile computing market has never had a microprocessor specifically designed for it: the microprocessors used in mobile PCs are optimized versions of the desktop PC microprocessor. Mobile computing makes very different demands on processors than desktop computing, yet up until now, mobile x86 platforms have simply made do with the same old processors originally designed for desktops. Those processors consume lots of power, and they get very hot. When you're on the go, a power-hungry processor means you have to pay a price: run out of power before you've finished, run more slowly and lose application performance, or run through the airport with pounds of extra batteries.

A hot processor also needs fans to cool it, making the resulting mobile computer bigger, clunkier, and noisier. A newly designed microprocessor with low power consumption will still be rejected by the market if its performance is poor. So any attempt in this regard must strike a proper performance-power balance to ensure commercial success. A newly designed microprocessor must also be fully x86-compatible; that is, it should run x86 applications just like conventional x86 microprocessors, since most presently available software has been designed to work on the x86 platform.

Download Report >>

Delay Tolerant Networking

Increasingly, network applications must communicate with counterparts across disparate networking environments characterized by significantly different sets of physical and operational constraints; wide variations in transmission latency are particularly troublesome. The proposed Interplanetary Internet, which must encompass both terrestrial and interplanetary links, is an extreme case. An architecture based on a “least common denominator” protocol that can operate successfully and (where required) reliably in multiple disparate environments would simplify the development and deployment of such applications. The Internet protocols are ill-suited for this purpose. The three fundamental principles that would underlie a delay-tolerant networking (DTN) architecture, and the main structural elements of that architecture, centered on a new end-to-end overlay network protocol called bundling, are examined here. The Internet infrastructure adaptations that might yield comparable performance are also examined, but it is seen that the simplicity of the DTN architecture promises easier deployment and extension.
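
The bundling idea is essentially store-and-forward with custody: a node holds a bundle until a usable contact appears, however long that takes. The sketch below illustrates this under simplifying assumptions; the node and queue names are invented for illustration and are not taken from the bundling specification.

```python
# Toy store-and-forward relay in the DTN spirit.
from collections import deque

class Node:
    def __init__(self, name):
        self.name = name
        self.store = deque()   # bundles held until a contact is available

    def receive(self, bundle):
        self.store.append(bundle)      # take custody; keep until forwarded

    def contact(self, neighbor):
        # forward everything stored once a (possibly rare) link comes up
        while self.store:
            neighbor.receive(self.store.popleft())

earth, relay, mars = Node("earth"), Node("relay"), Node("mars")
earth.receive({"dst": "mars", "data": "hello"})
earth.contact(relay)    # link available now
relay.contact(mars)     # link available much later; the bundle survives
print(mars.store)
```
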
Download Report >>

Cellular Digital Packet Data (CDPD)

Cellular Digital Packet Data (CDPD) is a specification for supporting wireless access to the Internet and other public data networks. CDPD transmits digital packet data at 19.2 kbps, using idle times between cellular voice calls on the cellular telephone network. CDPD technology represents a way for law enforcement agencies to improve how they manage their communications and information systems.

Data transmitted on CDPD systems travels several times faster than data sent using analog networks.

CDPD is an overlay to the existing cellular network, which enables users to transmit packets of data over the cellular network using a portable computing device and a CDPD modem. CDPD offers a high-speed, high-capacity, low-cost system with the greatest possible coverage. Additionally, data is encrypted for security. CDPD air-link transmissions have a 19,200 bps raw data rate. As a tool for transmitting data, CDPD utilizes digital networks.

The Raven is a rugged, full-duplex Cellular Digital Packet Data (CDPD) modem that provides wireless transport capabilities for fixed and mobile applications. The Raven is an efficient and secure wireless packet data technology that is ideal for untethered applications.

Download Report >>

Biometric Technology

BIOMETRICS refers to the automatic identification of a person based on his or her physiological or behavioral characteristics: a fingerprint or iris pattern, or some aspect of behaviour like handwriting or keystroke patterns. Biometrics is being applied both to identity verification and to recognition, and the problem each involves is somewhat different. Verification requires the person being identified to lay claim to an identity, so the system has two choices: accepting or rejecting the person's claim. Recognition requires the system to look through many stored sets of characteristics and pick the one that matches the unknown individual being presented. A BIOMETRIC system is essentially a pattern recognition system, which makes a personal identification by determining the authenticity of a specific physiological or behavioral characteristic possessed by the user.

Biometrics is a rapidly evolving technology which is being used in forensics, such as criminal identification and prison security, and it has the potential to be used in a large range of civilian application areas. Biometrics can be used to secure transactions conducted via telephone and the Internet (electronic commerce and electronic banking). In automobiles, biometrics can replace keys with key-less entry devices.
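
The verification/recognition distinction above amounts to a 1:1 check against a claimed identity versus a 1:N search over all enrolled users. A minimal sketch, assuming toy feature vectors and an arbitrary distance threshold:

```python
# Toy illustration of verification (1:1) vs. recognition (1:N).
import math

enrolled = {"alice": [0.12, 0.80, 0.33], "bob": [0.91, 0.15, 0.60]}

def distance(a, b):
    return math.dist(a, b)

def verify(claimed_id, sample, threshold=0.2):
    """1:1 check: accept or reject a claimed identity."""
    return distance(enrolled[claimed_id], sample) < threshold

def recognize(sample):
    """1:N search: return the closest enrolled identity."""
    return min(enrolled, key=lambda uid: distance(enrolled[uid], sample))

sample = [0.11, 0.79, 0.35]
print(verify("alice", sample))   # True: the claim is accepted
print(recognize(sample))         # 'alice'
```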

Download Report >>

Embedded Linux

Although the term "Embedded Linux" is only a couple of years old, Embedded Linux has already established itself as one of the most important technologies to enter the embedded computing market. The power, reliability, flexibility, and scalability of Linux, combined with its support for a multitude of microprocessor architectures, hardware devices, graphics, and communications protocols, have established Linux as an increasingly popular software platform for a vast array of projects and products. Use of Linux spans the full spectrum of computing applications, from IBM's tiny Linux wristwatch, to hand-held devices and consumer entertainment systems, to Internet appliances, thin clients, firewalls, and other equipment, and even to cluster-based supercomputers.

If you could travel back in time to the Embedded Systems Conference of September 1999, one short year ago, you would find that the "Embedded Linux Market" simply did not exist. Sure, a growing number of developers and a handful of companies were starting to embed Linux. But as a market that anyone tracked, or paid attention to, Embedded Linux simply hadn't made it onto the radar screens. One year ago, embedding Linux was a relatively rare phenomenon and was mostly the result of developer innovation, not the fruits of marketing plans and promotional strategies.

"Embedded Linux" has now become a disruptive force in the market.The open availability of source, coupled with today's unheralded ease and speed of collaboration and communication, turned out to be compelling factors that enabled developers to quickly and efficiently adapt to the challenges of rapidly changing landscape. So Linux began to spread like wildfire in the embedded market.

Download Report >>

Extreme programming

Extreme programming (XP) is a lightweight methodology for small-to-medium-sized teams developing software in the face of vague or rapidly changing requirements. XP is a deliberate and disciplined approach to software development that stresses customer satisfaction. It makes an extreme attempt to dramatically simplify the process of developing software systems by focusing on what delivers value: the requirements for the system and the code that implements the system. Requirement specification in the form of user stories, code development by pairs of developers (pair programming), simplification of the code through refactoring, and careful repeated testing are the outstanding features of the extreme programming technique. XP improves a software project in four essential ways: communication, simplicity, feedback, and courage. XP has rejuvenated the notion of evolutionary design with practices that allow evolution to become a viable design strategy.

Download Report >>

Extreme Ultraviolet Lithography

Silicon has been the heart of the world's technology boom for nearly half a century. Each year, manufacturers bring out the next great computer chip that boosts computing power and allows our personal computers to do more than we imagined just a decade ago. The current technology used to make microprocessors, deep-ultraviolet lithography, will begin to reach its limit around 2005. At that time, chipmakers will have to look to other technologies to cram more transistors onto silicon to create more powerful chips. Many are already looking at extreme-ultraviolet lithography (EUVL) as a way to extend the life of silicon at least until the end of the decade.

Akin to photography, lithography is used to print circuits onto microchips. Extreme Ultraviolet Lithography (EUVL) will open a new chapter in semiconductor technology. In the race to provide the Next Generation Lithography (NGL) for faster, more efficient computer chips, EUV lithography is the clear frontrunner. Here we discuss the basic concepts and current state of development of EUV lithography, a relatively new form of lithography that uses extreme ultraviolet (EUV) radiation with a wavelength in the range of 10 to 14 nanometers (nm) to carry out projection imaging. EUVL is one technology vying to become the successor to optical lithography.

Download Report >>

Optical Computing Technology

Optical computing means performing computations, operations, storage, and transmission of data using light. Instead of silicon chips, an optical computer uses organic polymers like phthalocyanine and polydiacetylene. Optical technology promises massive upgrades in the efficiency and speed of computers, as well as significant shrinkage in their size and cost. An optical desktop computer is capable of processing data up to 100,000 times faster than current models.

Download Report >>

DSP processors

The architecture of DSP processors has become a major topic ever since the development of digital signal processing. Digital signal processing is finding wider application in almost all fields, and so are the efforts to develop faster digital signal processors; a fast DSP processor requires a more sophisticated architecture. Two main companies in this field are Texas Instruments and Analog Devices. The basic architectures of DSP processors from these companies are discussed here.

The major components are illustrated below:

1. Fast and flexible arithmetic units
2. Extended dynamic range
3. Single-cycle fetch of two operands
4. Hardware circular buffers
5. Zero-overhead looping and branching

The DSP processors discussed here have a modified Harvard architecture and feature minimized power consumption and a high degree of parallelism.
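
Two items on the list above, hardware circular buffers and zero-overhead looping, exist so that delay-line addressing costs no cycles. The sketch below emulates that addressing pattern in software to show what the DSP hardware does implicitly; the class and method names are illustrative assumptions.

```python
# Software emulation of a circular (ring) buffer, which DSPs provide natively.
class CircularBuffer:
    def __init__(self, size):
        self.data = [0.0] * size
        self.head = 0

    def push(self, sample):
        self.data[self.head] = sample
        self.head = (self.head + 1) % len(self.data)  # wrap-around: the step
                                                      # DSP hardware does free

    def taps(self):
        # newest-first view, as an FIR filter would read its delay line
        n = len(self.data)
        return [self.data[(self.head - 1 - i) % n] for i in range(n)]

buf = CircularBuffer(4)
for s in [1.0, 2.0, 3.0, 4.0, 5.0]:
    buf.push(s)
print(buf.taps())  # [5.0, 4.0, 3.0, 2.0]
```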

Digital radio broadcasting

Digital radio broadcasting is a technique which gives listeners interference-free reception of high-quality sound, easy-to-use radios, and the potential for wider listening choice through many additional stations and services. This paper deals with different aspects of DAB, viz. the frequency bands used, modulation techniques, compression techniques, the advantages, and various modes of implementation.
Download Report >>

Synthetic Aperture Radar

Synthetic Aperture Radar, or SAR, is an imaging radar system that sends a microwave pulse to the surface of the earth and registers the reflections from the earth's surface. On-board processing and compression of the data obtained from the SAR are vital for image formation. The development of enabling technologies for space-borne SAR instruments has been a major focus of research and development during the last few years. At present, SAR systems provide only images; in the future they will have to deliver dedicated information to each particular user.

Quantum Dot Lasers

Quantum dot lasers can be considered a quantum leap in the development of lasers. Quantum dots fundamentally improve laser emission. This property of quantum dots is well utilized for fiber-optic communication, which is now a leading subject of research and development, and quantum dots are thus very well suited to fiber-optic communication applications. The remaining major division of the field of quantum electronics deals with the interactions of coherent light with matter and again leads to a wide range of all-optical and optoelectronic devices. Basically, quantum dots are made of InGaAs or simply GaAs structures. The possibility of extended-wavelength (λ > 1.1 µm) emission from GaAs-based devices is also an important characteristic of quantum dots. The QDs are formed by an optimized growth approach of alternating sub-monolayer deposition of column III and column V constituents for optoelectronic device fabrication. Thus there is a large energy separation between states.

Grid computing

The last decade has seen a substantial increase in commodity computer and network performance, mainly as a result of faster hardware and more sophisticated software. Nevertheless, there are still problems, in the fields of science, engineering, and business, which cannot be effectively dealt with using the current generation of supercomputers. In fact, due to their size and complexity, these problems are often highly numerically and/or data-intensive and consequently require a variety of heterogeneous resources that are not available on a single machine. A number of teams have conducted experimental studies on the cooperative use of geographically distributed resources unified to act as a single powerful computer. This new approach is known by several names, such as metacomputing, scalable computing, global computing, Internet computing, and, more recently, Grid computing.

The early efforts in Grid computing started as a project to link supercomputing sites, but have now grown far beyond their original intent. In fact, many applications can benefit from the Grid infrastructure, including collaborative engineering, data exploration, high-throughput computing, and of course distributed supercomputing. Moreover, due to the rapid growth of the Internet and Web, there has been a rising interest in Web-based distributed computing, and many projects have been started and aim to exploit the Web as an infrastructure for running coarse-grained distributed and parallel applications. In this context, the Web has the capability to be a platform for parallel and collaborative work as well as a key technology to create a pervasive and ubiquitous Grid-based infrastructure.

This paper aims to present the state of the art of Grid computing and attempts to survey the major international efforts in developing this emerging technology.

Download >>

Cluster Computing

A cluster is a type of parallel or distributed processing system which consists of a collection of interconnected stand-alone computers cooperatively working together as a single, integrated computing resource.

This cluster of computers shares common network characteristics like the same namespace and is available to other computers on the network as a single resource. These computers are linked together using high-speed network interfaces, and the actual binding together of all the individual computers in the cluster is performed by the operating system and the software used.

Download Report >>

Clockless Chips

Clock speeds are now in the gigahertz range, and there is not much room for speedup before physical realities start to complicate things. With a gigahertz clock powering a chip, signals barely have enough time to make it across the chip before the next clock tick. At this point, speeding up the clock frequency could become disastrous. This is where a chip that is not constrained by a clock comes into action.

The clockless approach, which uses a technique known as asynchronous logic, differs from conventional computer circuit design in that the switching on and off of digital circuits is controlled individually by specific pieces of data rather than by a tyrannical clock that forces all of the millions of circuits on a chip to march in unison.
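
A building block often used in such asynchronous circuits is the Muller C-element, whose output changes only when its inputs agree and otherwise holds its state; this is how data, rather than a clock, paces the circuit. A minimal behavioral sketch (illustrative only):

```python
# Behavioral model of a Muller C-element, a basic asynchronous-logic gate.
class CElement:
    def __init__(self):
        self.out = 0

    def update(self, a, b):
        if a == b:          # both inputs agree: the output follows them
            self.out = a
        return self.out     # otherwise the output holds its previous value

c = CElement()
for a, b in [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]:
    print(a, b, "->", c.update(a, b))   # output changes only on agreement
```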

A major hindrance to the development of clockless chips is the competitiveness of the computer industry. Presently, it is nearly impossible for companies to develop and manufacture a clockless chip while keeping the cost reasonable. Another problem is that there are not many tools for developing asynchronous chips. Until this changes, clockless chips will not be a major player in the market.

In this seminar the topics covered are the general concept of asynchronous circuits, their design issues, and types of design. The major designs discussed are the bounded-delay method, the delay-insensitive method, and Null Convention Logic (NCL).

The seminar also compares synchronous and asynchronous circuits and surveys the applications in which asynchronous circuits are used.

Download Report >>

Cellular Neural Networks

The cellular neural network (CNN) is a revolutionary concept and an experimentally proven computing paradigm for analog computers. A standard CNN architecture consists of an m×n rectangular array of cells C(i,j) with Cartesian coordinates. Considering the inputs and outputs of a cell as binary arguments, it can realize Boolean functions. Using this technology, analog computers mimic the anatomy and physiology of many sensory and processing organs with stored programmability. This has been called the “sensor revolution”, with cheap sensors and MEMS arrays in the desired forms of artificial eyes, ears, noses, etc. Such a computer is capable of computing 3 trillion equivalent digital operations per second, a performance that can only be matched by supercomputers. Owing to their unique architecture, CNN chips are mainly used for processing brain-like tasks, which are non-numeric and spatio-temporal in nature and require no more than the accuracy of common neurons.
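
For concreteness, each cell in the standard (Chua-Yang) CNN evolves as dx/dt = -x + A*y + B*u + z over its 3x3 neighborhood, with output y = 0.5*(|x+1| - |x-1|). The sketch below integrates these dynamics on a small grid with a commonly used edge-detection template; the template values, grid size, and step size are illustrative assumptions, not taken from the text.

```python
# Toy CNN simulation: Euler integration of the standard cell dynamics.
import numpy as np

def output(x):
    # standard CNN output nonlinearity: y = 0.5*(|x+1| - |x-1|)
    return 0.5 * (np.abs(x + 1) - np.abs(x - 1))

def conv3(grid, k):
    # 3x3 neighborhood sum with zero padding (each cell sees its neighbors)
    p = np.pad(grid, 1)
    out = np.zeros_like(grid)
    for di in range(3):
        for dj in range(3):
            out += k[di, dj] * p[di:di + grid.shape[0], dj:dj + grid.shape[1]]
    return out

# Edge-detection templates (a common illustrative choice)
A = np.zeros((3, 3)); A[1, 1] = 2.0                      # feedback template
B = np.array([[-1, -1, -1], [-1, 8, -1], [-1, -1, -1]])  # control template
z = -0.5                                                 # bias

u = np.zeros((8, 8)); u[2:6, 2:6] = 1.0   # input image: a white square
x = np.zeros_like(u)
for _ in range(200):                      # Euler steps of dx/dt
    x += 0.05 * (-x + conv3(output(x), A) + conv3(u, B) + z)

print((output(x) > 0).astype(int))        # only the square's edges stay active
```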

Download Report >>

Differentiated Services

An increasing demand for Quality of Service on the Internet has led to various developments in that area. Differentiated Services is a technique to provide such Quality of Service in an efficient and scalable way.

Management of computer networks involves both the monitoring of running services and the configuration of those services. On the Internet, the SNMP protocol is used to retrieve and set variables in a MIB. In order to facilitate the management of routers equipped with Differentiated Services, the IETF has created the DiffServ MIB (which is still work in progress).

This assignment involves building a prototype implementation of the DiffServ MIB using a router running the GNU/Linux operating system, using the Network Traffic Control facilities in the kernel and the net-snmp SNMP agent software.

The IETF diffserv WG is still working on the DiffServ MIB. The results of implementation work are valuable to the MIB authors, as they may help in improving the MIB specification. Therefore, any results should be reported back to the IETF.

Download Report >>

Compact Peripheral Component Interconnect

Compact peripheral component interconnect (CPCI) is an adaptation of the peripheral component interconnect (PCI) specification for industrial computer applications requiring a smaller, more robust mechanical form factor than the one defined for the desktop. CompactPCI is an open standard supported by the PCI Industrial Computer Manufacturers Group (PICMG). CompactPCI is best suited for small, high-speed industrial computing applications where transfers occur between a number of high-speed cards.

CompactPCI is a high-performance industrial bus that uses the Eurocard form factor and is fully compatible with the Enterprise Computer Telephony Forum (ECTF) computer telephony (CT) Bus™ H.110 standard specification. CompactPCI products make it possible for original equipment manufacturers (OEM), integrators, and resellers to build powerful and cost-effective solutions for telco networks, while using fewer development resources.

Download Report >>


Autonomic Computing

“Autonomic Computing” is a new vision of computing initiated by IBM. This new paradigm shifts the fundamental definition of the technology age from one of computing to one defined by data. Access to data from multiple distributed sources, in addition to traditional centralized storage devices, will allow users to transparently access information when and where they need it. At the same time, this new view of computing will necessitate changing the industry's focus from processing speed and storage to developing distributed networks that are largely self-managing, self-diagnostic, and transparent to the user.

The high-tech industry has spent decades creating computer systems with ever-mounting degrees of complexity to solve a wide variety of business problems. Ironically, complexity itself has become part of the problem. It's a problem that's not going away, but will grow exponentially, just as our dependence on technology has.

The solution may lie in automation, or creating a new capacity where important computing operations can run without the need for human intervention. In Autonomic Computing we build computer systems that regulate themselves much in the same way our nervous system regulates and protects our bodies.

Download Report >>

Augmented reality

Video games have been entertaining us for nearly 30 years, ever since Pong was introduced to arcades in the early 1970s. Computer graphics have become much more sophisticated since then, and soon game graphics will seem all too real. In the next decade, researchers plan to pull graphics out of your television screen or computer display and integrate them into real-world environments. This new technology, called augmented reality, will further blur the line between what is real and what is computer-generated by enhancing what we see, hear, feel, and smell.

Augmented reality will truly change the way we view the world. Picture yourself walking or driving down the street. With augmented-reality displays, which will eventually look much like a normal pair of glasses, informative graphics will appear in your field of view, and audio will coincide with whatever you see. These enhancements will be refreshed continually to reflect the movements of your head.

Augmented reality is still in the early stages of research and development at various universities and high-tech companies. Eventually, possibly by the end of this decade, we will see the first mass-marketed augmented-reality system, which has been described as “the Walkman of the 21st Century”.

Download Report >>


Aeronautical Communications

In the future, airliners will provide a variety of entertainment and communications equipment to the passenger. Since people are becoming more and more used to their own communications equipment, such as mobile phones and laptops with Internet connections (either through a network interface card or dial-in access through modems), business travellers will soon be demanding wireless access to communication services. Specifically, this paper focuses on wireless services such as UMTS and W-LAN in aircraft cabins that connect the passenger via satellite to terrestrial infrastructure. In an aeronautical scenario, global coverage is essential for providing continuous service. Satellite communication therefore becomes indispensable, and together with the ever-increasing data-rate requirements of applications, aeronautical satellite communication meets an expensive market. Certain features of UMTS and W-LAN that help to provide these services are also explained.


Download Report >>


BiCMOS Technology

The need for high-performance, low-power, and low-cost systems for network transport and wireless communications is driving silicon technology toward higher speed, higher integration, and more functionality. Furthermore, this integration of RF and analog mixed-signal circuits into high-performance digital signal-processing (DSP) systems must be done with minimum cost overhead to be commercially viable. While some analog and RF designs have been attempted in mainstream digital-only complementary metal-oxide semiconductor (CMOS) technologies, almost all designs that require stringent RF performance use bipolar or BiCMOS technology. Silicon integrated circuit (IC) products that, at present, require modern bipolar or BiCMOS silicon technology in the wired application space include synchronous optical network (SONET) and synchronous digital hierarchy (SDH) equipment operating at 10 Gb/s and higher.

The viability of a mixed digital/analog/RF chip depends on the cost of making the silicon with the required elements; in practice, it must approximate the cost of the CMOS wafer. Cycle times for processing the wafer should not significantly exceed cycle times for a digital CMOS wafer, and yields of the SOC chip must be similar to those of a multi-chip implementation. Much of this article examines process techniques that achieve the objectives of low cost, rapid cycle time, and solid yield.

Download Report >>