AIA Week 6

Advanced Internet Development @ Georgia Institute of Technology.

CS 8803 AIA
[2.1 WS]
Strengths:
- The main purpose of the paper is to present the problems and issues associated with generating synthetic Web server traffic in a testbed environment. These problems then serve as the basis for the alternative, more robust web traffic generation methodology presented in the paper.

- In the first half of the paper, up to Section 4.0, the problems with synthetic traffic generation are laid out for the reader. The authors are very clear and concise about these problems. One major and obvious problem is that there is a limit to the number of client machines that can be placed on a LAN, due to economic and financial constraints.

- The traditional model cannot generate large volumes of web traffic. This inability arises because the traditional synthetic model assumes that the HTTP requests generated are independent of the clients' think times. This is a fundamental flaw, and the authors correct it by making request generation highly dependent on factors such as human users' sleep and wake patterns. It is this high correlation that allows the new model to generate traffic that is both heavy and bursty. As a result, the generated rate of requests from clients can exceed the capacity of the server, which is what happens in realistic working scenarios on the Internet, as seen in Figure 2 of the paper.

- The authors' new methodology makes it possible to evaluate web server performance under overload conditions. This lets researchers and developers take into account the poor performance of Unix machines under heavy load by simulating realistic client request streams (see the sketch after these strengths).
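To make the contrast concrete, here is a minimal Python sketch of the two generation models described above. This is my own illustration, not the authors' implementation; the server capacity, think time, and offered rates are assumed numbers chosen only to show the effect.

import random

# Illustrative constants (assumptions, not figures from the paper).
SERVER_CAPACITY = 100.0               # requests/sec the simulated server can serve
SERVICE_TIME = 1.0 / SERVER_CAPACITY  # seconds to serve one request
SIM_TIME = 30.0                       # seconds of simulated time

def closed_loop_offered_rate(n_clients, think_time):
    # Traditional model: each client waits for its response and then
    # "thinks" before sending again, so the aggregate offered rate can
    # never exceed n_clients / (service_time + think_time).
    return n_clients / (SERVICE_TIME + think_time)

def open_loop_arrivals(target_rate):
    # New-style generator: requests are issued on a schedule that is
    # independent of server responses (Poisson arrivals here), so the
    # offered rate may be set above SERVER_CAPACITY.
    t, arrivals = 0.0, []
    while t < SIM_TIME:
        t += random.expovariate(target_rate)  # exponential inter-arrival
        arrivals.append(t)
    return arrivals

def served_rate(arrivals):
    # Single-server queue: excess requests wait, so throughput
    # saturates at SERVER_CAPACITY once the server is overloaded.
    busy_until, served = 0.0, 0
    for t in arrivals:
        busy_until = max(t, busy_until) + SERVICE_TIME
        if busy_until <= SIM_TIME:
            served += 1
    return served / SIM_TIME

print("closed loop, 50 clients, 1 s think time: offered rate capped at",
      round(closed_loop_offered_rate(50, 1.0), 1), "req/s")
for rate in (50, 100, 200):  # 200 deliberately exceeds server capacity
    print("open loop offered", rate, "req/s -> served",
          round(served_rate(open_loop_arrivals(rate)), 1), "req/s")

Running this shows the closed-loop clients capped near 50 req/s no matter how slow the server is, while the open-loop generator pushes the offered load to 200 req/s and the served rate saturates at the server's capacity, which is exactly the overload regime the authors want to study.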

Weaknesses:
- The authors understand, at a very high level, the nature of the traffic that needs to be simulated in a testbed environment, but this is not conclusive enough. They do not take into account the characteristics of client-based traces. The authors might have considered collecting actual traces of clients on a given subset of the Internet; by performing statistical analysis on the mean and variance of the observed distributions, a simulated client request generator could have been derived (see the sketch at the end of this review).

- The value of such statistical analysis is that client requests could be generated more faithfully, reflecting actual world events, popular culture, and economic conditions.

- A router is added to the network to allow simulation of WAN delays. The authors could be more precise by making the WAN delay a function of the number of routers traversed, within a certain tolerance limit. This again requires more work and statistical analysis, but it would allow future researchers to simulate web traffic accurately as a mathematical function within an acceptable tolerance (see the sketch below).
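As a sketch of the two suggestions above, the following Python fragment fits a lognormal distribution to the mean and variance of inter-arrival times taken from a client trace and samples synthetic request times from it, then models WAN delay as per-hop latency times router count, jittered within a tolerance. Everything here is hypothetical: the stand-in trace, the lognormal choice, and the 5 ms/hop figure are my assumptions, not anything from the paper.

import math
import random
import statistics

def fit_lognormal(inter_arrivals):
    # Method-of-moments fit: choose lognormal parameters so the
    # synthetic traffic matches the trace's mean and variance.
    m = statistics.mean(inter_arrivals)
    v = statistics.variance(inter_arrivals)
    sigma_sq = math.log(1.0 + v / (m * m))
    mu = math.log(m) - sigma_sq / 2.0
    return mu, math.sqrt(sigma_sq)

def synthetic_request_times(inter_arrivals, n):
    # Generate n request timestamps whose spacing follows the
    # distribution fitted from the real trace.
    mu, sigma = fit_lognormal(inter_arrivals)
    t, times = 0.0, []
    for _ in range(n):
        t += random.lognormvariate(mu, sigma)
        times.append(t)
    return times

def wan_delay_ms(hops, per_hop_ms=5.0, tolerance=0.2):
    # WAN delay as a function of router count: per-hop latency times
    # hop count, jittered uniformly within +/- tolerance (the 5 ms/hop
    # figure is an illustrative assumption).
    return hops * per_hop_ms * random.uniform(1.0 - tolerance, 1.0 + tolerance)

# Stand-in for a real client trace: seconds between successive requests.
trace = [0.8, 1.3, 0.4, 2.9, 0.7, 1.1, 5.2, 0.3, 0.9, 1.6]
print("synthetic request times:",
      [round(t, 2) for t in synthetic_request_times(trace, 5)])
print("delay across 8 router hops:", round(wan_delay_ms(8), 1), "ms")

A generator driven this way would reproduce the burstiness actually observed in client traces rather than an assumed arrival process, and the hop-based delay function gives future researchers a single tunable formula for WAN behavior instead of a fixed router in the testbed.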