
ASSIGNMENT # 1: MOBILE AND WIRELESS COMMUNICATION

SUBMITTED BY: LUBNA NADEEM (11F-MS-TE-02)
SUBMITTED TO: RANA GHULAM SHABEER
DATE: 29th Feb, 2011

Department of Telecommunication Engineering, U.E.T Taxila


1. Cross Layer Design Definition

To fully optimize wireless broadband networks, both the challenges of the physical medium and the QoS demands of the applications have to be taken into account. Rate, power, and coding at the physical layer can be adapted to meet the requirements of the applications given the current channel and network conditions. Knowledge has to be shared between (all) layers to obtain the highest possible adaptivity.

The CLD approach to network stack design is historically a big shift in how one designs a communication system. Not only do the applications, protocols, and hardware need to be re-implemented to support the new extensions, but the whole concept of CLD challenges everything engineers and researchers know about network protocols, layers, stack design, and system construction. While CLD might seem revolutionary rather than evolutionary at first, there are implementations available today which try to incorporate some of the key elements of the CLD philosophy into existing protocols and layers.

Fig.: Cross-layer framework for the design of ad-hoc wireless networks to support delay-critical applications, such as conversational voice or real-time video.

The Evolutionary Approach to CLD

An evolutionary approach to CLD always seeks to extend the existing layered structure in order to maintain compatibility. Note, however, that there is a big difference between the evolutionary basic and the evolutionary system-wide approaches, which range from:

• simple, yet effective, solutions that extend parts of the strict layering structure
• system-wide CLD, where stack-wide layer interdependencies are designed and implemented to optimize overall network performance


Most CLDs today are evolutionary, because compatibility with existing systems and networks is extremely important both for end users and for commercial actors. The simple, yet effective, CLD solutions are by far the most common today. The reason for this is quite simple: since an evolutionary CLD is always bound by its original strict layered structure, an extension of it will also always be limited.

The Revolutionary Approach to CLD

A revolutionary approach to CLD, or to any design for that matter, is not bound by an existing implementation, and as such does not need to compromise to maintain compatibility. Where an evolutionary CLD approach prioritizes compatibility first and performance later, a revolutionary design does the opposite. If the compatibility problem remains unsolved, however, it is hard to get commercial parties interested in the product, and the long time-to-market also makes it difficult to justify research on revolutionary ideas. Since most CLDs today are evolutionary, it is hard to find examples of a revolutionary CLD. A revolutionary approach can, however, be applied to highly specific problems where backwards compatibility is not important.

Applications

• Wireless Internet access
• Ad hoc networks (tactical)
• Sensor networks

Diverse requirements

• High-bandwidth video and data
• Low-bandwidth voice and data

Goal

• Reliable communication-on-the-move in highly dynamic environments
• QoS provisioning

2. Cognitive Radio

A cognitive radio is a transceiver that automatically changes its transmission or reception parameters in a way that gives the wireless communications spectrum agility, i.e. the ability to select available wireless channels opportunistically. The main process is also called dynamic spectrum management. A cognitive radio, as defined by the researchers at Virginia Polytechnic Institute and State University, is "a software defined radio with a cognitive engine brain".

The concept of cognitive radio was first proposed by Joseph Mitola III in a seminar at KTH, the Royal Institute of Technology in Stockholm, in 1998, and published later in an article by Mitola and Gerald Q. Maguire, Jr. in 1999. It was a novel approach in wireless communications that Mitola later described as: "The point in which wireless personal digital assistants (PDAs) and the related networks are sufficiently computationally intelligent about radio resources and related computer-to-computer communications to detect user communications needs as a function of use context, and to provide radio resources and wireless services most appropriate to those needs."


Cognitive radio is considered an ideal goal towards which a software-defined radio platform should evolve: a fully reconfigurable wireless transceiver that automatically adapts its communication parameters to network and user demands.

Cognitive Radio Types

Depending on the set of parameters taken into account in deciding on transmission and reception changes, we can distinguish certain types of cognitive radio:

• Full Cognitive Radio, in which every possible parameter observable by a wireless node or network is taken into account.
• Spectrum Sensing Cognitive Radio, in which only the radio frequency spectrum is considered.

Depending on the parts of the spectrum available for cognitive radio, we can distinguish:

• Licensed Band Cognitive Radio, which is capable of using bands assigned to licensed users, apart from unlicensed bands such as the U-NII or ISM bands. The IEEE 802.22 working group is developing a standard for wireless regional area networks (WRAN), which will operate in unused television channels.
• Unlicensed Band Cognitive Radio, which can only utilize unlicensed parts of the radio frequency spectrum. One such system is described in the IEEE 802.15 Task Group 2 specification, which focuses on the coexistence of IEEE 802.11 and Bluetooth.

Features of Cognitive Radios

• Spectrum Sensing: detecting unused spectrum and sharing it without harmful interference to other users. Sensing spectrum holes is an important requirement of a cognitive radio network, and detecting primary users is the most efficient way to find them. Spectrum sensing techniques can be classified into three categories (a minimal energy-detection sketch follows this list):
  o Transmitter detection: cognitive radios must have the capability to determine whether a signal from a primary transmitter is locally present in a certain spectrum. Several approaches have been proposed: matched filter detection, energy detection, and cyclostationary feature detection.
  o Cooperative detection: spectrum sensing methods in which information from multiple cognitive radio users is combined for primary user detection.
  o Interference-based detection.
• Spectrum Management: capturing the best available spectrum to meet user communication requirements while not creating undue interference to other (primary) users. Cognitive radios should decide on the best spectrum band to meet the quality-of-service requirements over all available spectrum bands; spectrum management functions are therefore required. These functions can be classified as:
  o spectrum analysis
  o spectrum decision

• Spectrum Mobility: the process by which a cognitive radio user changes its frequency of operation. Cognitive radio networks aim to use the spectrum in a dynamic manner by allowing radio terminals to operate in the best available frequency band, maintaining seamless communication during the transition to better spectrum.
• Spectrum Sharing: providing a fair spectrum scheduling method. Spectrum sharing is one of the major challenges in open spectrum usage; it can be regarded as similar to generic media access control (MAC) problems in existing systems.
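
The energy-detection approach named above can be illustrated with a short sketch. The threshold rule, noise figures, and test signal below are illustrative assumptions, not part of any standard; real detectors derive the threshold from a target false-alarm probability.

```python
import numpy as np

def energy_detect(samples, noise_power, threshold_factor=2.0):
    """Declare the band occupied if the average received power exceeds a
    noise-derived threshold (simple energy-detection sketch)."""
    energy = np.mean(np.abs(samples) ** 2)
    return energy > threshold_factor * noise_power, energy

rng = np.random.default_rng(0)
n = 4096
noise_power = 1.0
noise = (rng.normal(0, np.sqrt(noise_power / 2), n)
         + 1j * rng.normal(0, np.sqrt(noise_power / 2), n))

# Hypothetical primary-user signal: a complex tone well above the noise floor
primary = 1.5 * np.exp(2j * np.pi * 0.1 * np.arange(n))

print(energy_detect(noise, noise_power))            # (False, ~1.0): spectrum hole
print(energy_detect(noise + primary, noise_power))  # (True, ~3.25): primary user present
```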

Practical Applications

CR can sense its environment and, without the intervention of the user, can adapt to the user's communication needs while conforming to FCC rules. Efficient use of the spectrum is a growing concern, and CR offers a solution to this problem.

A CR can intelligently detect whether any portion of the spectrum is in use, and can temporarily latch into or out of it without interfering with the transmissions of other users, thereby utilizing the spectrum efficiently.

3. ZigBee

ZigBee is a specification for wireless personal area networks (WPANs) operating at 868 MHz, 902-928 MHz, and 2.4 GHz. A WPAN is a personal area network (a network for interconnecting an individual's devices) in which the device connections are wireless. Using ZigBee, devices in a WPAN can communicate at speeds of up to 250 kbps while physically separated by distances of up to 50 meters in typical circumstances, and greater distances in an ideal environment. ZigBee is based on the IEEE 802.15.4 specification approved by the Institute of Electrical and Electronics Engineers Standards Association (IEEE-SA). ZigBee provides for high data throughput in applications where the duty cycle is low. This makes ZigBee ideal for home, business, and industrial automation, where control devices and sensors are commonly used. Such devices operate at low power levels, and this, in conjunction with their low duty cycle (typically 0.1 percent or less), translates into long battery life. Applications well suited to ZigBee include heating, ventilation, and air conditioning (HVAC), lighting systems, intrusion detection, fire sensing, and the detection and notification of unusual occurrences. ZigBee supports most topologies, including peer-to-peer, star, and mesh networks, and can handle up to 255 devices in a single WPAN.
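
The link between a low duty cycle and long battery life can be made concrete with a back-of-envelope calculation. The current and capacity figures below are illustrative assumptions, not ZigBee specifications.

```python
# Back-of-envelope sketch of why a 0.1 % duty cycle gives long battery life.
# The current and capacity figures are illustrative assumptions, not ZigBee specs.
active_mA = 30.0      # assumed current while the radio is transmitting/receiving
sleep_mA = 0.003      # assumed deep-sleep current
duty_cycle = 0.001    # 0.1 % duty cycle, as quoted above
battery_mAh = 1000.0  # assumed battery capacity

avg_mA = duty_cycle * active_mA + (1 - duty_cycle) * sleep_mA
hours = battery_mAh / avg_mA
print(f"average current ~ {avg_mA:.3f} mA, lifetime ~ {hours / (24 * 365):.1f} years")
# ~0.033 mA average -> roughly 3.5 years on a 1000 mAh battery
```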

ZigBee has been developed to meet the growing demand for capable wireless networking between numerous low-power devices. In industry, ZigBee is being used for next-generation automated manufacturing, with small transmitters in every device on the floor allowing the devices to communicate with a central computer. This new level of communication permits finely tuned remote monitoring and manipulation. In the consumer market, ZigBee is being explored for everything from linking low-power household devices such as smoke alarms to a central housing control unit, to centralized lighting controls.


4. 4G LTE-Advanced Technology Overview

4G LTE refers to the evolved version of LTE being developed by 3GPP to meet or exceed the requirements of the International Telecommunication Union (ITU) for a true fourth-generation radio-communication standard known as IMT-Advanced. 4G LTE, whose project name is LTE-Advanced, is initially being specified in Release 10 of the 3GPP standard, with a functional freeze targeted for March 2011. The 4G LTE standard will continue to be developed in subsequent releases. In October 2009, the 3GPP Partners formally submitted LTE-Advanced to the ITU as a candidate for 4G IMT-Advanced. The certified technology specifications for IMT-Advanced are expected to be published in early 2011.

Key ITU requirements for IMT-Advanced that 4G LTE will support include the following:

• A high degree of common functionality worldwide while retaining the flexibility to support a wide range of local services and applications in a cost-efficient manner
• Compatibility of services within IMT and with fixed networks
• Capability for interworking with other radio systems
• High-quality mobile services
• User equipment suitable for worldwide use
• User-friendly applications, services, and equipment
• Worldwide roaming capability
• Enhanced peak data rates to support advanced mobile services and applications (100 Mbps for high mobility and 1 Gbps for low mobility)

Major technical considerations for 4G LTE development include:

• Continual improvement to the LTE radio technology and architecture
• Scenarios and performance requirements for interworking with legacy radio access technologies
• Backward compatibility of LTE-Advanced with LTE: an LTE terminal should be able to work in an LTE-Advanced network and vice versa; any exceptions will be considered by 3GPP
• Account taken of recent WRC-07 decisions on new IMT spectrum, as well as of existing frequency bands, to ensure that LTE-Advanced geographically accommodates available spectrum for channel allocations above 20 MHz; the requirements must also recognize those parts of the world in which wideband channels are not available

System performance requirements

The system performance requirements for 4G LTE will in most cases exceed those of IMT-Advanced. The 1 Gbps peak data rate required by the ITU will be achieved in 4G LTE using 4x4 MIMO and a transmission bandwidth wider than approximately 70 MHz. In terms of spectral efficiency, today's LTE (Release 8) satisfies the IMT-Advanced requirement for the downlink, but the bps/Hz must be doubled in LTE-Advanced to meet the 4G requirement.
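
To see roughly how the 1 Gbps figure follows from those two ingredients, the arithmetic below combines bandwidth, MIMO streams, and an assumed per-stream spectral efficiency. The efficiency value is an illustrative assumption (roughly what a high-order modulation with light coding overhead yields), not a 3GPP figure.

```python
# Rough illustration of the 1 Gbps peak-rate claim above (illustrative numbers).
bandwidth_hz = 70e6            # "wider than approximately 70 MHz"
mimo_streams = 4               # 4x4 MIMO
bps_per_hz_per_stream = 3.75   # assumed usable spectral efficiency per stream

peak_bps = bandwidth_hz * mimo_streams * bps_per_hz_per_stream
print(f"peak rate ~ {peak_bps / 1e9:.2f} Gbps")   # ~1.05 Gbps
```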

The peak rates for LTE-Advanced are substantially higher than the IMT-Advanced requirements, which highlights a desire to drive up peak performance in 4G LTE, although targets for average performance are closer to the ITU requirements. However, TR 36.913 states that targets for average spectral efficiency and for cell-edge user throughput efficiency should be given higher priority than targets for peak spectral efficiency and other features such as VoIP capacity. Thus 4G LTE work will focus on the challenges of raising average and cell-edge performance.

5. Orthogonal Frequency-Division Multiplexing (OFDM)

OFDM is a broadband multicarrier modulation method that offers superior performance and benefits over older, more traditional single-carrier modulation methods because it is a better fit with today's high-speed data requirements and with operation in the UHF and microwave spectrum. It has been known since at least the 1960s and 1970s. Originally known as multicarrier modulation, as opposed to the traditional single-carrier modulation, OFDM was extremely difficult to implement with the electronic hardware of the time, so it remained a research curiosity until semiconductor and computer technology made it a practical method. OFDM has since been adopted as the modulation method of choice for practically all the new wireless technologies being used and developed today. It is perhaps the most spectrally efficient method discovered so far, and it mitigates the severe problem of multipath propagation that causes massive data errors and loss of signal in the microwave and UHF spectrum.

Wireless Technologies That Use OFDM

The list is long and impressive. First, OFDM is used for digital radio broadcasting, specifically Europe's DAB and Digital Radio Mondiale, and in the U.S.'s HD Radio. It is used in TV broadcasting, such as Europe's DVB-T and DVB-H. You will also find it in wireless local-area networks (LANs) like Wi-Fi: the IEEE 802.11a/g/n standards are based on OFDM. The wideband wireless metro-area network (MAN) technology WiMAX uses OFDM, and the almost completed 4G cellular technology standard Long-Term Evolution (LTE) uses it as well. The high-speed short-range technology known as Ultra-Wideband (UWB) uses an OFDM standard set by the WiMedia Alliance. OFDM is also used in wired communications such as power-line networking. One of the first successful and most widespread uses of OFDM was in data modems connected to telephone lines: ADSL and VDSL, used for Internet access, employ a form of OFDM known as discrete multi-tone (DMT). And there are other, less well known examples in the military and satellite worlds.

How OFDM Works

OFDM is based on the concept of frequency-division multiplexing (FDM), the method of transmitting multiple data streams over a common broadband medium. That medium could be radio spectrum, coax cable, twisted pair, or fiber-optic cable. Each data stream is modulated onto a separate carrier within the bandwidth of the medium, and all are transmitted simultaneously. A good example of such a system is cable TV, which transmits many parallel channels of video and audio over a single fiber-optic and coax cable. The FDM technique is typically wasteful of bandwidth or spectrum because, to keep the parallel modulated carriers from interfering with one another, you have to space them with guard bands, and even then very selective filters at the receiving end have to separate the signals from one another. What researchers discovered is that with digital transmissions the carriers can be spaced much more closely and still be separated, which means less wasted spectrum and bandwidth.

The serial digital data stream to be transmitted is split into multiple slower data streams, and each is modulated onto a separate carrier in the allotted spectrum. These carriers are called subcarriers or tones. The modulation can be any form of modulation used with digital data, but the most common are binary phase-shift keying (BPSK), quadrature phase-shift keying (QPSK), and quadrature amplitude modulation (QAM). The outputs of all the modulators are linearly summed, and the result is the signal to be transmitted. It can be upconverted and amplified if needed.
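
In practice, the bank of modulators and the summation described above are implemented with an inverse FFT. The following OFDM-modulator sketch uses illustrative parameters (64 subcarriers, QPSK, a 16-sample cyclic prefix) that are not tied to any particular standard.

```python
import numpy as np

# Minimal sketch of an OFDM modulator: map bits to QPSK symbols, place one
# symbol per subcarrier, and sum the subcarriers with an inverse FFT.
n_subcarriers = 64
cp_len = 16                       # cyclic prefix length (guard interval)

rng = np.random.default_rng(1)
bits = rng.integers(0, 2, size=2 * n_subcarriers)

# QPSK: two bits -> one complex symbol on the unit circle
symbols = ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)

# One IFFT sums all modulated subcarriers into one time-domain OFDM symbol
time_signal = np.fft.ifft(symbols, n_subcarriers)
ofdm_symbol = np.concatenate([time_signal[-cp_len:], time_signal])  # prepend cyclic prefix

# Receiver side (ideal, noiseless channel): strip the prefix and take an FFT
recovered = np.fft.fft(ofdm_symbol[cp_len:], n_subcarriers)
print(np.allclose(recovered, symbols))   # True: the subcarriers separate cleanly
```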

Advantages of OFDM

The first reason is spectral efficiency, also called bandwidth efficiency. What that term really means is that you can transmit more data faster in a given bandwidth in the presence of noise. The measure of spectral efficiency is bits per second per hertz, or bps/Hz. For a given chunk of spectrum space, different modulation methods will give you widely varying maximum data rates for a given bit error rate (BER) and noise level. Simple digital modulation methods like amplitude-shift keying (ASK) and frequency-shift keying (FSK) are only fair but simple. BPSK and QPSK are much better. QAM is very good but more subject to noise and low signal levels. Code division multiple access (CDMA) methods are even better. But none is better than OFDM when it comes to getting the maximum data capacity out of a given channel. It comes close to the so-called Shannon limit, which defines the channel capacity C in bits per second (bps) as

C = B × log2(1 + S/N)

Here, B is the bandwidth of the channel in hertz, and S/N is the signal-to-noise power ratio. With spectrum scarce or just plain expensive, spectral efficiency has become the holy grail in wireless.
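
A quick worked example of that formula, using illustrative numbers (10 MHz of bandwidth and a 20 dB signal-to-noise ratio):

```python
import math

# Worked example of the Shannon limit quoted above, with illustrative numbers.
B = 10e6                  # assumed channel bandwidth: 10 MHz
snr_db = 20.0             # assumed signal-to-noise ratio: 20 dB
snr = 10 ** (snr_db / 10)

C = B * math.log2(1 + snr)
print(f"C ~ {C / 1e6:.1f} Mbps, i.e. {C / B:.2f} bps/Hz")
# ~66.6 Mbps, about 6.66 bps/Hz: the ceiling any modulation in that channel can approach
```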

Disadvantages of OFDM

Like anything else, OFDM is not perfect. It is very complex, making it more expensive to implement; however, modern semiconductor technology makes it pretty easy. OFDM is also sensitive to carrier frequency variations. To overcome this problem, OFDM systems transmit pilot carriers along with the subcarriers for synchronization at the receiver. Another disadvantage is that an OFDM signal has a high peak-to-average power ratio. As a result, the complex OFDM signal requires linear amplification, which means greater inefficiency in the RF power amplifiers and more power consumption.

Orthogonal Frequency-Division Multiple Access (OFDMA)

The A stands for access. It means that OFDM is not only a great modulation method; it can also provide multiple access to a common bandwidth or channel for multiple users. You are probably familiar with multiple access methods like frequency-division multiple access (FDMA) and time-division multiple access (TDMA). CDMA, the widely used cellular technology, digitally codes each signal to be transmitted and then transmits them all in the same spectrum; because of their random nature, they simply appear as low-level noise to one another, and the digital coding lets the receiver sort the individual signals out later. OFDMA permits multiple users to share a common bandwidth with essentially the same benefits. The OFDM system assigns subgroups of subcarriers to each user. With thousands of subcarriers, each user would get a small percentage of the carriers. In a modern system like the 4G LTE cellular system, each user could be assigned from one to many subcarriers. In LTE, the subcarrier spacing is 15 kHz. Using a 10-MHz band, the total possible number of subcarriers would be 666; in practice, a smaller number like 512 would be used. If each subscriber is given six subcarriers, you could put 85 users in the band. The number of subcarriers assigned will depend on the user's bandwidth and speed needs.
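
The subcarrier and user counts quoted above can be reproduced with a couple of lines of arithmetic; the figures are the same illustrative ones used in the text.

```python
# Reproducing the OFDMA subcarrier/user figures quoted above (illustrative values).
band_hz = 10e6                  # 10-MHz band
subcarrier_spacing_hz = 15e3    # LTE subcarrier spacing
usable_subcarriers = 512        # the "smaller number" actually used
subcarriers_per_user = 6

raw_subcarriers = int(band_hz // subcarrier_spacing_hz)
users = usable_subcarriers // subcarriers_per_user
print(raw_subcarriers, users)   # 666 possible subcarriers, 85 users at 6 subcarriers each
```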