Sunday, December 30, 2012


edge computing
 

What is the best laptop?

The right question is not "what is the best laptop?" but "which laptop best meets the needs of the user?"

With a whole range of models and specifications on offer, there really is no single best laptop. To find the laptop that's right for you, first make sure it covers the basics that just about every laptop should have: wired and wireless networking, at least 2 GB of RAM, and a hard drive of at least 80 GB. Each new version of an operating system, application, or game requires more disk space than the one before, and today a 40 GB hard drive is simply not large enough.

What should you consider? Size, weight, and portability - the operating system - the processor (CPU) - graphics - USB ports - the CD/DVD drive - screen size - aesthetics - the role of the laptop - and saving money.

Size, weight, and portability - The most common laptop screens are around 13 to 15 inches. Because these sizes are the most common, they also carry the lowest prices compared to smaller or larger laptops with the same specification, which is nice if you want to save some money. If you travel around a lot, lugging a heavy laptop can be a real physical burden on top of the total price, which makes a smaller laptop a real blessing for many travelers. Bear in mind too that even if the laptop itself is not very heavy, the AC adapter/power supply, the case, and the other small accessories you need all add weight, and together they can become very heavy, especially if you carry them around for long periods of time.

The same cannot really be said of 17-inch laptops. If you're looking for one with a bigger screen, that's fine; but if you need to travel a lot, why not buy a small laptop and put the extra cash toward a larger external monitor? It can be used at the same time as the laptop's own screen, giving you additional display space.

CPU/Processor - CPUs have changed a lot over the past few years, and manufacturers now seem to be bolting more cores onto the processor to get more CPU power from a single chip. A laptop currently needs a dual-core CPU to run the newest generation of applications well, so dual-core processors are the standard at the moment. The two most common are the AMD Turion 64 X2 and the Intel Core Duo; both are very good CPUs, but AMD may have the edge because they are a bit cheaper. If you don't mind waiting and have the money, some super-fast quad-core laptops are coming to the market any time now.

Vista or XP - If you want a laptop running Windows XP but find it can only be purchased with Windows Vista installed (which is very common), make sure XP can actually be installed before you wipe Vista away and simply replace it with XP. Many new laptops have no XP drivers written for them, which can create big problems when you try to move to XP; in that case most people end up wasting even more time re-installing Vista.

Graphics cards - It is generally not possible to replace or upgrade the video capabilities of a laptop; whatever video card is installed is the one you are stuck with. If you are into games or need high-end graphics performance, you will need a laptop with a graphics card that has dedicated memory; otherwise, the graphics performance of any application that leans heavily on high-end graphics or games will suffer.

USB ports - You can never have enough. The average number of USB ports on a laptop is four, and this may seem like a lot, but once you connect peripherals such as a pen drive, keyboard, mouse, and printer, it won't seem like many at all. Of course you can always buy an external hub, as they are very cheap, but the better option is to buy a laptop with more ports in the first place. Another point worth checking: every laptop should now come with USB 2.0 ports, but it is worth confirming, as the older USB 1.x ports are far slower.

CD/DVD drive - Almost all new laptops should now have a dual-layer DVD writer drive that will do just about anything: reading and writing CDs and DVDs, and playing them as well.

Screen size - This overlaps with portability to a certain degree: the bigger the screen, the heavier the laptop, and the more power it consumes, so if you also invest in a heavy-duty battery it will get heavier still. There are also some beautiful screen technologies to choose from, including glossy coatings such as TruBrite.

Aesthetics, or the appearance of the laptop - Which would you prefer: a sluggish laptop that looks nice, or a lightning-fast machine built like a house brick? When you search for a laptop there is a very large range to choose from, so if you can't find what you're looking for from one manufacturer, try another; you really should not have to compromise on either the specifications or the looks. But if you do have to compromise, give a little on the looks, not on the specifications.

The role of the laptop - What will your laptop be used for? For general Internet surfing and Word documents, just about any laptop can do the job standing on its head, so the world of laptops is your oyster. For business applications such as large spreadsheets, a mid-powered machine will be fine; you don't need to worry too much about graphics capability, and the usual shared memory should be fine. High-end graphics work and games require more powerful processors and graphics cards, so be prepared to dig deep if you want to play the latest games with all the extra effects turned on.

Save money - If you want a good laptop and also want to save some money (who doesn't?), you have a small number of options. How about a second-hand laptop? You can save a lot of money this way, but you really need to know what you're looking for, or you can end up with a laptop that has problems or one that is not as advertised; and counting on it still being under warranty can be too risky.

Refurbished laptops are a safer alternative for saving money; they should also come with some sort of guarantee, though this varies among retailers.
Bargain-basement deals - Special offers can be exceptional value. Check online stores for laptop models that are end-of-line or have been replaced by later models; there are very good machines to be had this way, and you should still get a full guarantee. But there are always people looking out for these deals, so be quick and check often; many of these machines end up being resold for a profit on sites like eBay.


Edge computing has many advantages:

*Edge application services significantly decrease the data volume that must be moved, the consequent traffic, and the distance the data must go, thereby reducing transmission costs, shrinking latency, and improving quality of service (QoS). (A sketch of this idea follows the list.)

*Edge computing eliminates, or at least de-emphasizes, the core computing environment, limiting or removing a major bottleneck and a potential point of failure.

*Security is also improved as encrypted data moves further in, toward the network core. As it approaches the enterprise, the data is checked as it passes through protected firewalls and other security points, where viruses, compromised data, and active hackers can be caught early on.

*Finally, the ability to "virtualize" (i.e., logically group CPU capabilities on an as-needed, real-time basis) extends scalability. The edge computing market is generally based on a "charge for network services" model, and it could be argued that typical customers for edge services are organizations desiring linear scale of business application performance to the growth of, e.g., a subscriber base.
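To make the first advantage above concrete, here is a minimal Python sketch of an edge node that aggregates raw readings locally and forwards only a compact summary to the core. The names (EdgeNode, forward_to_core, the window size) are hypothetical, not part of any real edge platform; this is just one way the data volume that must cross the network could shrink.

    # A sketch of edge-side aggregation (all names hypothetical).
    # Instead of shipping every raw reading to the core, the edge node
    # buffers a window of samples and forwards only a small summary,
    # cutting traffic and the distance the bulk data must travel.

    from statistics import mean

    class EdgeNode:
        def __init__(self, window_size=1000):
            self.window_size = window_size
            self.buffer = []

        def ingest(self, reading):
            """Accept one raw reading; flush a summary when the window fills."""
            self.buffer.append(reading)
            if len(self.buffer) >= self.window_size:
                summary = {
                    "count": len(self.buffer),
                    "mean": mean(self.buffer),
                    "min": min(self.buffer),
                    "max": max(self.buffer),
                }
                self.buffer.clear()
                self.forward_to_core(summary)  # a few numbers, not the whole window

        def forward_to_core(self, summary):
            # Stand-in for a real upload; only the summary leaves the edge.
            print("to core:", summary)

    node = EdgeNode(window_size=5)
    for r in [12.1, 12.3, 11.9, 12.0, 12.2]:
        node.ingest(r)

Under these assumptions, one summary dictionary replaces a whole window of raw readings, which is exactly the traffic and latency saving the first bullet describes.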
 

Grid computing

Edge computing and grid computing are related. Whereas grid computing would be hardcoded into a specific application to distribute its complex and resource-intensive computational needs across a global grid of cheap networked machines, edge computing provides a generic template facility for any type of application to spread its execution across a dedicated grid of prepared, expensive machines.
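As a hedged illustration of the grid half of this contrast, here is a minimal Python sketch in which the application itself hardcodes how its resource-intensive computation is split across a pool of workers. Local processes stand in for the grid's cheap networked machines; the chunking scheme and function names are invented for illustration.

    # A sketch of grid-style work distribution (local worker processes stand
    # in for a grid of cheap networked machines). The application decides
    # how to split its own workload -- the "hardcoded" distribution the
    # paragraph above describes.

    from concurrent.futures import ProcessPoolExecutor

    def heavy_chunk(chunk):
        # Stand-in for a resource-intensive computation on one slice of data.
        return sum(x * x for x in chunk)

    def run_on_grid(data, workers=4):
        chunks = [data[i::workers] for i in range(workers)]  # one slice per worker
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return sum(pool.map(heavy_chunk, chunks))

    if __name__ == "__main__":
        print(run_on_grid(range(1_000_000)))

Edge computing, by contrast, would offer this kind of distribution as a generic facility of the platform rather than logic written into each application.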

Edge computing provides application processing and load-balancing capacity to corporate and other large-scale web servers. It is like an application cache, where the cache is in the Internet itself. Static websites being cached on mirror sites is not a new concept; mirroring transactional and interactive systems, however, is a much more complex endeavor.
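As a small illustration of the "application cache in the Internet" idea, here is a hedged Python sketch of an edge server that memoizes origin responses with a time-to-live, answering repeat requests locally and going back to the origin only on a miss. The class, function names, and TTL value are assumptions for illustration; as noted above, a simple cache like this suits static content, while mirroring transactional systems needs much more care about consistency.

    # A sketch of an edge-side application cache (all names hypothetical).
    # Repeat requests are answered at the edge; only cache misses travel
    # the full distance to the origin server.

    import time

    class EdgeCache:
        def __init__(self, fetch_from_origin, ttl_seconds=60):
            self.fetch_from_origin = fetch_from_origin  # callable: key -> value
            self.ttl = ttl_seconds
            self.store = {}  # key -> (value, expiry_time)

        def get(self, key):
            value, expiry = self.store.get(key, (None, 0.0))
            if time.time() < expiry:
                return value                     # served from the edge
            value = self.fetch_from_origin(key)  # miss: full trip to the origin
            self.store[key] = (value, time.time() + self.ttl)
            return value

    def origin(key):
        # Stand-in for the real origin server round trip.
        return "page for " + key

    cache = EdgeCache(origin, ttl_seconds=30)
    print(cache.get("/home"))  # first request goes to the origin
    print(cache.get("/home"))  # second request is answered at the edge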


Ideal distributed computing

Spector pointed out that

Distributed computing is 30 years old, but was not very deeply understood until recently.

Understanding of (truly) large-scale, open, integrated distributed systems had been limited.

Particular aspects of distributed systems that had not been deeply understood included:

 *Requirements for systems in which the application needs (and APIs) are not known in advance.

 *Systems with 10^6 or even 10^7 processes, with consequent enormous complexity.

Spector claimed that – as in the case of transaction processing – “there has been lots of incremental progress with distributed systems, picking away at problem areas.”

Improvements that can be expected for huge distributed systems of computers, arising from computer science research, include:

 *Online system optimization.

 *Data checking – verifying consistency and validating data/configuration files (a sketch of this item follows the list).

 *Dynamic repair – e.g., finding the closest feasible solution after an incident (a computer breaking down).

 *Better efficiency in the energy usage of these systems.

 *Improvements in managing security and privacy.
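As a hedged sketch of the data-checking item above, here is a small Python example that validates a configuration's entries for presence, type, and range before the system uses them. The schema and field names are invented for illustration; a real system would check far more.

    # A sketch of data/config checking (schema and field names invented).
    # Each entry is validated before use, so corrupt or inconsistent
    # files are caught early rather than crashing the system later.

    SCHEMA = {
        "replicas":  (int,   lambda v: v >= 1),
        "timeout_s": (float, lambda v: v > 0),
        "region":    (str,   lambda v: len(v) > 0),
    }

    def check_config(config):
        """Return a list of problems; an empty list means the config is valid."""
        problems = []
        for field, (ftype, in_range) in SCHEMA.items():
            if field not in config:
                problems.append("missing field: " + field)
            elif not isinstance(config[field], ftype):
                problems.append(field + ": expected " + ftype.__name__)
            elif not in_range(config[field]):
                problems.append(field + ": value out of range")
        return problems

    print(check_config({"replicas": 3, "timeout_s": 2.5, "region": "eu"}))  # []
    print(check_config({"replicas": 0, "timeout_s": "fast"}))  # three problems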

Hybrid, not Artificial Intelligence

Hybrid intelligence is like an extension of distributed computing: people become part of the system that works out the answers.

Spector said that Google’s approach was:

To see if some problem can be solved by people and computers working together.

As a familiar example, Search doesn’t try to offer the user only the one best result.  It provides a set of results, and relies on the user picking answers from the list generated by the computer.

Hybrid intelligence can be contrasted with AI (artificial intelligence):

 *AI aims at creating computers as capable as people, often in very broad problem domains.  While progress has been made, this has turned out to be very challenging;

 *Instead, it has proven more useful for computers to extend the capability of people, not in isolation, and to focus on more specific problem areas.

Computer systems can learn from feedback from users, with powerful virtuous circles.  Spector said that aggregation of user responses has proven extremely valuable in learning, such as:

 *feedback in the ranking of results, or in prioritising spelling-correction options (a sketch of this item follows the list);

 *semi-supervised image content analysis, speech recognition, etc.
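As a hedged sketch of the ranking-feedback item, here is a tiny Python example that aggregates user clicks and folds them back into the ordering of results, so the list the computer generates improves from human choices. The scoring scheme and click weight are invented for illustration and are not Google's method.

    # A sketch of click-feedback re-ranking (scoring scheme invented).
    # The system proposes an ordered list, users pick answers, and the
    # aggregated picks feed back into the ordering -- a small virtuous circle.

    from collections import defaultdict

    class FeedbackRanker:
        def __init__(self, base_scores):
            self.base = dict(base_scores)   # result -> base relevance score
            self.clicks = defaultdict(int)  # result -> aggregated user clicks

        def record_click(self, result):
            self.clicks[result] += 1

        def ranked(self, click_weight=0.1):
            score = lambda r: self.base[r] + click_weight * self.clicks[r]
            return sorted(self.base, key=score, reverse=True)

    ranker = FeedbackRanker({"doc_a": 1.0, "doc_b": 0.9, "doc_c": 0.8})
    for _ in range(3):          # users repeatedly prefer doc_c
        ranker.record_click("doc_c")
    print(ranker.ranked())      # doc_c rises: ['doc_c', 'doc_a', 'doc_b']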

Ongoing research

As I viewed this video, part of my brain told me that perhaps I should return to an academic life, in the midst of a computer science faculty somewhere in the world.

I share Spector’s conclusion:

It’s a time of unprecedented diversity and fertility in computer science – and amazing challenges abound;

The results from computer science should continue to make the world a better place.

Spector pointed out that key research challenges are published on the Google Research Blog. Examples he listed included:

· increasingly fluid partnership between people and computation;

· fundamental changes in the methods of science;

· rethinking the database;

· CS+X, for all X (how Computer Science, CS, can assist and even transform other fields of study, X);

· computing with ultra-low power (e.g., just ambient light as a power source).

Chapel Hill, North Carolina:

This workshop will address some recent developments on new commodity architectures, including GPUs, multi-core CPUs, the Cell processor, PPU and other emerging commodity architectures. Some of the issues to be examined in the workshop include the software challenges that arise in programming these new commodity architectures and their impact on different applications and high-performance computing. The workshop will bring together leading researchers and designers from academia, research labs, industrial organizations and federal agencies. The workshop is intended to be both technical and strategic in nature. The objectives of the workshop include the following:

  • To review the state-of-the-art research in compilers, software environments, scientific computation, high-performance computing, GPGPU, computer architecture and related areas;
  • To identify areas of critical needs where further research can advance the state of technology and/or where the application can provide the impetus for basic scientific development in cutting-edge computing on new commodity architectures;
  • To create a forum for discussion on addressing these critical issues, with a constructive evaluation of research focus towards a "multidisciplinary" coordinated effort for collaboration;
  • To provide insights for future research directions and potential new research initiatives in these areas.

The workshop events will consist of invited presentations given by renowned researchers, panel discussions on advantages and the trade-off of various architectures and the computing needs of various applications, contributed poster presentations and live research demonstrations, and expanded breaks allowing for extensive discussion. For more detail, please refer to the following:


Focused Topics

Given the increasing power of, and interest in, using new commodity architectures for different applications and high-performance computing, this workshop will explore cutting-edge computing using these architectures. Some of the issues include:

  • Do these commodity architectures have the potential for a wide variety of applications and computing needs? What are their algorithmic and architectural niches and can they be broadened?
  • What are the major issues in terms of programmability, language and compiler support and software environments for these new commodity architectures?
  • What are some of the future technology trends that can lead to more widespread use of these commodity architectures?
  • How can these commodity architectures be used as a mainstream processor for high-performance computing?
  • How can these processors be used for other applications including database queries, data mining, physical simulation, etc.?

 

Overlay Web Service Network

We propose the Overlay Web Service Network (OWSN), an approach for integrated Web services based on a CDN (Content Delivery Network). Service-integration workflows are deployed on edge servers located close to service consumers. A component service provider can utilize a message cache proxy on the edge servers or deploy its own application-specific proxy. These proxies, which we call "service frontend modules," are responsible for managing communication through the Internet between the edge servers and the original service sources, which we call "service backend modules." The frontend and backend modules can use private protocols suitable for the application and provide a certain service level (e.g., response time) to the integrator's workflow. The workflow does not have to be aware of such application-specific protocols. Instead, it can invoke WSDL request-response operations with a specification of service-level requirements, including conditions of failure and priorities among multiple service-level metrics (e.g., response time and data freshness).
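To make the frontend/backend split more concrete, here is a hedged Python sketch of a "service frontend module" on an edge server: it answers the integrator's workflow from a local message cache when it can, calls the backend over the private protocol otherwise, and checks a response-time requirement with a failure condition. All class and method names are invented; OWSN itself defines no such API, and a real module would enforce the time limit on the call itself rather than after it returns.

    # A sketch of an OWSN-style service frontend module (all names invented).
    # The frontend sits on an edge server, hides the private frontend/backend
    # protocol from the workflow, and checks a response-time service level.

    import time

    class ServiceFrontend:
        def __init__(self, call_backend, max_response_s=0.5):
            self.call_backend = call_backend  # private protocol to the backend
            self.max_response_s = max_response_s
            self.cache = {}                   # the message cache proxy

        def invoke(self, operation, payload):
            """Request-response invocation with a response-time requirement."""
            key = (operation, payload)
            if key in self.cache:
                return self.cache[key]        # answered at the edge
            start = time.monotonic()
            result = self.call_backend(operation, payload)  # trip to the backend
            if time.monotonic() - start > self.max_response_s:
                raise TimeoutError(operation + ": service level violated")
            self.cache[key] = result
            return result

    def backend(operation, payload):
        # Stand-in for the origin-side "service backend module."
        return operation + " result for " + payload

    frontend = ServiceFrontend(backend, max_response_s=0.5)
    print(frontend.invoke("getQuote", "ACME"))  # goes to the backend
    print(frontend.invoke("getQuote", "ACME"))  # answered from the edge cache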

Business Roles


As illustrated in Figure 1, we assume the following roles to be acted by the business organizations involved in delivering integrated Web services from component services to end users:

·         A service provider exposes service components. It deploys service frontend modules (such as a cache proxy) on a service manager's servers and manages execution of the service backend modules that communicate with those frontend modules.

·         A service integrator creates integrated Web services by combining service components. It deploys Web service workflows on a service manager's servers.

·         A service manager hosts service integrators' workflows and service providers' frontend modules. It manages execution of those modules, monitors service levels, and provides accounting services to both service providers and integrators. With UDDI service registries, it may also act as a broker between service providers and service integrators, and between service integrators and service consumers.

·         A service consumer finds an integrated service provider through the service manager's registry, requests integrated services from the service integrator, and is bound to the service manager's service endpoint (i.e., an edge server).

 

 

Edge Computing Framework

·         Mirror Image provides a unique, framework-based approach which makes it fast and simple for customers to deploy and execute customized business logic on the Mirror Image Content Delivery Network. The framework was developed by identifying commonly occurring edge-specific criteria and actions in customers’ applications, and standardizing them in a flexible, extensible framework so that our customers do not need to “re-invent the wheel” each time that new edge logic needs to be created and deployed on our network.

·         The Mirror Image Edge Computing Framework Architecture

 

 

References

http://en.wikipedia.org/wiki/Edge_computing


Thank you very much

Saturday, December 29, 2012

My post


business continuity planning

 

What is business continuity planning?

Critical services or products are those that must be delivered to ensure survival, avoid causing injury, and meet legal or other obligations of an organization. Business Continuity Planning is a proactive planning process that ensures critical services or products are delivered during a disruption.

A Business Continuity Plan includes:

§  Plans, measures and arrangements to ensure the continuous delivery of critical services and products, which permits the organization to recover its facility, data and assets.

§  Identification of necessary resources to support business continuity, including personnel, information, equipment, financial allocations, legal counsel, infrastructure protection and accommodations.

Having a BCP enhances an organization's image with employees, shareholders and customers by demonstrating a proactive attitude. Additional benefits include improvement in overall organizational efficiency and identifying the relationship of assets and human and financial resources to critical services and deliverables.

Why is business continuity planning important?

Every organization is at risk from potential disasters that include:

§  Natural disasters such as tornadoes, floods, blizzards, earthquakes and fire

§  Accidents

§  Sabotage

§  Power and energy disruptions

§  Communications, transportation, safety and service sector failure

§  Environmental disasters such as pollution and hazardous materials spills

§  Cyber attacks and hacker activity.

Creating and maintaining a BCP helps ensure that an institution has the resources and information needed to deal with these emergencies.

 

Business continuity software

 

Latest information

New version of OpsPlanner business continuity software available

 New features include a ‘preparedness assessment’ tool, ISO 22301 program support, and dependency-based plan and recovery workflows.

COOP Systems aligns business continuity software with ISO 22301

 COOP Systems has announced that its myCOOP business continuity planning and management software now supports the full life cycle of the new ISO 22301 business continuity standard.

INONI releases budget business continuity planning solution

 INONI, a UK based provider of business continuity management software, has announced changes and improvements to its INONI Lite product.

CLIO Planner: a new business continuity planning solution from Badger Software

 CLIO Planner is highly intuitive and needs a minimum of training and practice.

 ‘Business Continuity Manager’ enters the business continuity software market

 New software from LockPath is part of a wider GRC package.

 

Here is a simple link about BCP – a Business Continuity Plan solution by Wallace Wireless:

http://www.youtube.com/watch?v=4AZDsjV16iI

Thursday, December 20, 2012

LAN vs WAN

Here is the video link:
https://www.youtube.com/watch?v=siJhNLYnFfc

Disadvantages Of WAN


§ Are expensive and generally slow.
§ Need a good firewall to restrict outsiders from entering and disrupting the network.
§ Setting up a network can be an expensive and complicated experience; the bigger the network, the more expensive it is.
§ Security is a real issue when many different people have the ability to use information from other computers. Protection against hackers and viruses adds more complexity and expense.
§ Once set up, maintaining a network is a full-time job which requires network supervisors and technicians to be employed.
§ Information may not meet local needs or interests.
§ Vulnerable to hackers or other outside threats.

Wide area network (WAN)

Here is a simple video link explaining WAN:

https://www.youtube.com/watch?v=hFfdRyN1yaM&noredirect=1

Advantages Of WAN

§ Covers a large geographical area, so long-distance businesses can connect on the one network.
§ Shares software and resources with connecting workstations.
§ Messages can be sent very quickly to anyone else on the network. These messages can have pictures, sounds, or data included with them (called attachments).
§ Expensive things (such as printers or phone lines to the internet) can be shared by all the computers on the network without having to buy a different peripheral for each computer.
§ Everyone on the network can use the same data. This avoids problems where some users may have older information than others.
§ Shares information/files over a larger area.
§ Large network coverage.