Wednesday 27 February 2013

EMERGING TECHNOLOGIES



 NANOTECHNOLOGY: This is science and technology carried out at an extremely small scale, the nanoscale, which spans roughly 1 to 100 nanometers.
      Nanoscience and nanotechnology are the study and application of extremely small things and can be used across all the other science fields, such as chemistry, biology, physics, materials science, and engineering. Nanotechnology is not just a new field of science and engineering, but a new way of looking at and studying matter at its smallest scale.
                       BASIC CONCEPT OF NANOSCIENCE

It’s hard to imagine just how small nanotechnology is. One nanometer is a billionth of a meter, or 10⁻⁹ meters. Here are a few illustrative examples:
  •  There are 25,400,000 nanometers in an inch
  • A sheet of newspaper is about 100,000 nanometers thick
  • On a comparative scale, if a marble were a nanometer, then one meter would be the size of the Earth
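The arithmetic behind the illustrations above can be checked in a few lines of Python (a quick sketch; the inch, newspaper and marble figures are the ones quoted in the list):

```python
# Back-of-the-envelope checks of the scale figures above.
NM_PER_METER = 10 ** 9                # 1 m = 10**9 nm

# 1 inch = 2.54 cm = 254/10_000 m, so in nanometers:
nm_per_inch = 254 * NM_PER_METER // 10_000
print(nm_per_inch)                    # 25400000

# A sheet of newspaper is about 100,000 nm thick -- that is 0.1 mm:
newspaper_mm = 100_000 * 1_000 / NM_PER_METER
print(newspaper_mm)                   # 0.1

# Marble-to-Earth comparison: blowing a 1 nm object up to marble size
# (~1 cm) is a factor of 10**7, which takes 1 m to roughly the scale of
# the Earth (whose diameter is on the order of 10**7 m).
scale_factor = round(0.01 / 1e-9)
print(scale_factor)                   # 10000000
```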


Pictures of how nanoparticles are connected and look.


Nanoscience and nanotechnology involve the ability to see and to control individual atoms and molecules. Everything on Earth is made up of atoms—the food we eat, the clothes we wear, the buildings and houses we live in, and our own bodies.

GRID COMPUTING:   Imagine several million computers from all over the world, owned by thousands of different people. Imagine they include desktops, laptops, supercomputers, data vaults, and instruments like mobile phones, meteorological sensors and telescopes...

Now imagine that all of these computers can be connected to form a single, huge and super-powerful computer! This huge, sprawling, global computer is what many people dream "The Grid" will be.


This is how the idea of grid computing looks.

 "The Grid" takes its name from an analogy with the electrical "power grid". The idea was that accessing computer power from a computer grid would be as simple as accessing electrical power from an electrical grid.
     Though the concept isn't new, it's also not yet perfected. Computer scientists, programmers and engineers are still working on creating, establishing and implementing standards and protocols. Right now, many existing grid computing systems rely on proprietary software and tools. Once people agree upon a reliable set of standards and protocols, it will be easier and more efficient for organizations to adopt the grid computing model. On a very small scale, we may loosely think of the server of a cyber café as playing a similar coordinating role.
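The scheduling idea behind a grid can be sketched in miniature: a coordinator splits a big job into independent tasks, farms them out to whichever workers are free, and merges the partial results. This is only a toy model using a local thread pool as a stand-in for real machines; actual grid middleware handles discovery, security and fault tolerance as well.

```python
from concurrent.futures import ThreadPoolExecutor

def count_primes(lo, hi):
    """One independent task: count the primes in [lo, hi)."""
    def is_prime(n):
        if n < 2:
            return False
        return all(n % d for d in range(2, int(n ** 0.5) + 1))
    return sum(is_prime(n) for n in range(lo, hi))

def run_on_grid(lo, hi, n_workers=4):
    # "Job submission": split the range into one chunk per worker.
    step = (hi - lo) // n_workers
    chunks = [(lo + i * step, lo + (i + 1) * step) for i in range(n_workers)]
    chunks[-1] = (chunks[-1][0], hi)  # last chunk absorbs the remainder
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partials = pool.map(lambda c: count_primes(*c), chunks)
    return sum(partials)  # merge the partial results

# The distributed answer matches what a single machine would compute.
print(run_on_grid(2, 10_000))  # 1229 primes below 10,000
```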

QUANTUM COMPUTERS: Quantum computing is the area of study focused on developing computer technology based on the principles of quantum theory, which explains the nature and behavior of energy and matter on the quantum (atomic and subatomic) level. Development of a quantum computer, if practical, would mark a leap forward in computing capability far greater than that from the abacus to a modern-day supercomputer, with performance gains in the billion-fold realm and beyond. The quantum computer, following the laws of quantum physics, would gain enormous processing power through the ability to be in multiple states, and to perform tasks using all possible permutations simultaneously. Current centers of research in quantum computing include MIT, IBM, Oxford University, and the Los Alamos National Laboratory.
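The "multiple states" idea can be illustrated with a classical simulation of a single qubit's state vector: a pair of complex amplitudes for |0⟩ and |1⟩. This is not a quantum computer, of course, just ordinary arithmetic showing how a Hadamard gate creates an equal superposition and how a second application interferes the state back to |0⟩:

```python
import math

def hadamard(state):
    """Apply the Hadamard gate to a qubit state (a, b) = amplitudes of |0>, |1>."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Measurement probabilities for outcomes 0 and 1 (Born rule)."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

qubit = (1.0, 0.0)            # start in the definite state |0>
qubit = hadamard(qubit)       # now (|0> + |1>) / sqrt(2): both states at once
p0, p1 = probabilities(qubit)
print(round(p0, 3), round(p1, 3))   # 0.5 0.5 -- each outcome equally likely

back = hadamard(qubit)        # interference returns the qubit to |0>
print(round(abs(back[0]) ** 2, 3))  # 1.0
```

A real machine would hold such superpositions physically, and with n qubits the state vector has 2ⁿ amplitudes, which is where the enormous parallelism comes from.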


quantum computer blueprint

                 BRIEF HISTORY ON THE ORIGIN OF QUANTUM COMPUTERS

    The essential elements of quantum computing originated with Paul Benioff, working at Argonne National Labs, in 1981. He theorized a classical computer operating with some quantum mechanical principles. But it is generally accepted that David Deutsch of Oxford University provided the critical impetus for quantum computing research. In 1984, he was at a computation theory conference and began to wonder about the possibility of designing a computer that was based exclusively on quantum rules, then published his breakthrough paper a few months later. With this, the race began to exploit his ideas. However, before we delve into what he started, it is beneficial to have a look at the background of the quantum world. The biggest question in my mind, and probably in the minds of others, is how authentic and functional this is going to be. Let's keep our fingers crossed until then!

SEMANTIC WEB:   The Semantic Web is focused on machines, unlike Web 2.0, which focuses on humans. The Web requires a human operator, using computer systems to perform the tasks required to find, search and aggregate its information. It's impossible for a computer to do these tasks without human guidance because Web pages are specifically designed for human readers. The Semantic Web is a project that aims to change that by presenting Web page data in such a way that it is understood by computers, enabling machines to do the searching, aggregating and combining of the Web's information — without a human operator.


Semantic Web scenario.

WHO USES SEMANTIC WEB?
  It has taken years to put the pieces together that comprise the Semantic Web, including the standardization of RDF, the W3C release of the Web Ontology Language (OWL), and standardization on SPARQL, which adds querying capabilities to RDF. So with standards and languages in place, we can see Semantic Web technologies being used by early adopters.
Semantic Web technologies are popular in areas such as research and the life sciences, where they can help researchers by aggregating data on medicines and illnesses that have multiple names in different parts of the world. On the Web, Twine offers a knowledge-networking application built with Semantic Web technologies. The Joost online television service also uses Semantic technology on the backend; there it helps Joost users understand the relationships between pieces of content, enabling them to find the types of content they want most. Oracle offers a Semantic Web view of its Oracle Technology Network, called the OTN Semantic Web, to name a few of the companies implementing Semantic Web technologies. Let us see how it works!
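The core idea can be made concrete with a toy model: facts stored as (subject, predicate, object) triples, as in RDF, which a machine can query directly with no human reading of pages. The medicine data below is made up for illustration, and the pattern-matching function is only a sketch of what SPARQL does over real RDF stores:

```python
# Facts as RDF-style (subject, predicate, object) triples.
triples = [
    ("aspirin",     "treats",        "headache"),
    ("aspirin",     "also_known_as", "acetylsalicylic acid"),
    ("paracetamol", "treats",        "headache"),
    ("paracetamol", "also_known_as", "acetaminophen"),
]

def query(triples, subject=None, predicate=None, obj=None):
    """SPARQL-like pattern match: None acts as a wildcard variable."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# "Which medicines treat headaches?" -- answered without human guidance.
print([s for s, _, _ in query(triples, predicate="treats", obj="headache")])
# ['aspirin', 'paracetamol']

# Aggregating alternative names -- the multiple-names problem above.
print(query(triples, subject="paracetamol", predicate="also_known_as"))
```

Real deployments express the triples with URIs, describe their vocabulary in OWL, and query with SPARQL, but the machine-readable-facts principle is the same.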

  FOOD FOR THOUGHT 

 Will the world cease to exist if technology stops advancing?


Tuesday 26 February 2013

IS HANDLING IT COMPLEXITY AN OPTION?




 GENERAL VIEWPOINT
  
        The issue of complexity is unavoidable as far as business and other related projects are concerned. Since much money and time is always invested in these IT projects, much has to be done to ensure that they are not a fiasco. But could this be avoided totally? If not, what measures and procedures can be implemented to curb this hazard? Dealing with such a scenario starts with accepting the fact that complexity does exist; if not at the moment, then in the near future. Keeping this in mind is going to do your company much good.
      Really, avoiding complexity is not an option for CEOs; the choice comes when they have to respond to it. I abide by this statement 100% because, as far as business is concerned, there are a lot of uncertainties despite all the measures that could have been taken to avoid them. A good CEO who has this kind of scenario in mind is going to make all the necessary backup plans (a plan B) so that nothing takes him or her unaware or finds them wanting. But then, with their level of awareness and precaution, how well do they handle it?
  They cannot allow complexity to become a force that threatens their profit or overwhelms their customers and employees. From my own point of view, treating complexity as it comes is a better way of handling it. As said earlier, complexity is an unavoidable scenario, so if a CEO makes up his or her mind to expect it, he or she will obviously take precautions before it happens in due course. It is often said that "prevention is better than cure", but in this case we have to do things the other way round. Trying to stop or avoid complexity is like trying to stop rain from falling. Why do I say so? Because it is something natural and liable to happen at any time without any notification.
  But with good innovation and a good feasibility study before embarking on a project of this nature, taking all the necessary factors into account, one limits the probability of the project failing. Still, if failure does occur, how do we receive it: by surprise, or with awareness? So, again, as for me, responding to complexity as it comes is the best decision.

 FOOD FOR THOUGHT:

 It doesn't matter how much you want. What really matters is how much you want it. The extent and complexity of the problem does not matter as much as does the willingness to solve it.

Wednesday 6 February 2013

USABILITY IN SOFTWARE DESIGN


   

The term “usability” in the context of creating software represents an approach that puts the user, rather than the system, at the center of the process. This philosophy, called user-centered design, incorporates user concerns and advocacy from the beginning of the design process and dictates that the needs of the user should be foremost in any design decisions.


GENERAL VIEW:




      The most visible aspect of this approach is usability testing, in which users work and interact with the product interface and share their views and concerns with the designers and developers. All this is done to enable users to analyse the system and give their feedback for any possible amendments.








WHY BOTHER?

  Yes, really, why bother to include user-centered design in our software design? As mentioned earlier, no matter how tremendously sophisticated a system is, if it shows some degree of complexity to the user it will never be widely used, and so it will NOT satisfy its intended purpose. This would really be a massive fiasco and, as we all know, IT is trying to fight all these complexities in its latest innovations.

  With a better design comes better acceptance from users. The benefit of increased buy-in with retail software is obvious: increased sales. Acceptance is also important with software developed for internal use: increased buy-in leads to increased productivity and a diminished need for support. Visibly involving users from the beginning of development also shows them that you are interested in their concerns and needs, which increases their willingness to help you develop better software.




REFERENCES:

  • http://msdn.microsoft.com/en-us/library/ms997577.aspx
  • http://www.usability.gov/basics/ucd/index.html


Monday 4 February 2013

BASIC DIFFERENCES IN COMPUTER DISCIPLINES



 


   


OVERVIEW

There are five major disciplines of study that computing offers. There could be other disciplines, but we focus here just on the main five. These five disciplines are:
  • Computer Science
  • Software Engineering
  • Information Systems
  • Cognitive Science
  • Computer Engineering

  



The summary of all the computer disciplines can be seen in the table. The degree to which a particular aspect is covered is directly proportional to the number of plus signs (+) that accompany it; a dash (-) means the aspect is not a focus of that discipline.

Disciplines: CS = Computer Science, SE = Software Engineering, IS = Information Systems, CogSci = Cognitive Science, CE = Computer Engineering.

Aspect                                       | CS     | SE     | IS     | CogSci | CE
---------------------------------------------+--------+--------+--------+--------+--------
Organization issues and information systems  | +++    | +      | +++++  | -      | -
Application technologies                     | ++     | ++     | ++++++ | +      | +
Software methods and technologies            | +++    | ++++++ | +++++  | +      | ++
System infrastructure                        | ++++++ | ++     | ++     | -      | +++
Computer hardware and architecture           | -      | -      | -      | -      | ++++++
Health related issues                        | +      | +      | ++     | ++++   | -

   Regarding the statements “Most CS people laugh at MIS/IT people” and “MIS/IT people make more money and manage the CS folks”: there is no doubt that computer science has more opportunities than IT, as can be seen from the diagram below.


    Looking down on some members of an organisation is wrong and absurd, because each person is a master in his or her own field and area of work. Though I have no work experience with which to judge whether IT personnel make more money than CS personnel, to the best of my knowledge many IT personnel are CEOs of their own companies. On a global note, I think money-making depends on the terms of the contract, work experience, longevity of service and much more, not only on "who is who".