Welcome to my website

My name is Mohammad Zakzook. I'm a Computer Science student at CMUQ.

Randy Pausch

Randolph Frederick Pausch was a computer science and human-computer interaction professor at Carnegie Mellon University in Pennsylvania. He was born on October 23, 1960, in Baltimore, Maryland, US. He died of pancreatic cancer at the age of 47 on July 25, 2008, in Chesapeake, Virginia, US. Randy Pausch graduated in computer science from Brown University in May 1982. He then received his Ph.D. in computer science from Carnegie Mellon University.

Randy Pausch was best known for his talk "Really Achieving Your Childhood Dreams," widely known as "The Last Lecture." A "last lecture" is a tradition at Carnegie Mellon University in which a professor gives a lecture as if it were the last one they would ever give: what would they want to pass on to the world before leaving? In Pausch's case, it truly was his last lecture. Pausch was diagnosed with pancreatic cancer after agreeing to give a "last lecture." In this lecture, Pausch talks about his childhood dreams and how he achieved them. He also speaks directly to his children and tells them about himself, so that one day they can watch the lecture and get to know him.

But "The Last Lecture" is not the only thing Pausch was known for. Pausch has also helped create Alice and CMU's Entertainment Technology Center. Moreover, Pausch made many more inspiring lecturing one of which is "Time Management".

In this lecture he talks about how to make the most of your time. He equates time to a currency, which it essentially is, since companies pay you for your time. Pausch also mentions the 80/20 rule: once you become 80% proficient at something, the remaining 20% is much harder to gain and takes much more time, which may make it not worth pursuing. He then mentions the importance quadrant, which essentially says that important tasks with distant due dates are a higher priority than unimportant tasks that are due soon. Most importantly, he thinks a to-do list is essential; everyone should have a to-do list for the next year. One last thing he mentions is to find your productive time: everyone has one time of day when they are sharper than at any other. Finding that time lets you do the most demanding, brain-consuming work then and save the routine work for other times.
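
To make the quadrant idea concrete, here is a small Python sketch I put together (my own illustration, not from Pausch's lecture, with made-up tasks): it orders a task list so that important tasks come before unimportant ones, and only within the same importance level does the nearer deadline win.

    from dataclasses import dataclass

    @dataclass
    class Task:
        name: str
        important: bool
        days_until_due: int

    # Hypothetical task list for illustration.
    tasks = [
        Task("Reply to routine email", important=False, days_until_due=1),
        Task("Write project report", important=True, days_until_due=30),
        Task("Study for tomorrow's quiz", important=True, days_until_due=1),
    ]

    # Important tasks come first even if their deadline is far away;
    # within the same importance level, earlier deadlines come first.
    for task in sorted(tasks, key=lambda t: (not t.important, t.days_until_due)):
        print(task.name)

Note that the important report due in 30 days still outranks the unimportant email due tomorrow, which is exactly the quadrant rule described above.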

1. The 80/20 rule

2. Time as a currency: companies pay you for your time

3. Find your productive time

4. The quadrant of importance

5. Keep a to-do list

Randy Pausch info 1
Randy Pausch info 2
Randy Pausch info 3
Randy Pausch info 4
Randy Pausch info 5

Computer Security

In the computer industry, the term security, or the phrase computer security, refers to techniques for ensuring that data stored in a computer cannot be read by anyone without authorization. Most computer security measures involve data encryption and passwords. Data encryption is the translation of data into a form that is unintelligible without a deciphering mechanism. A password is a secret word that gives a user access to a particular program or system.
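
As a small illustration of the password idea, here is a sketch (my own example, using only Python's standard library) of how a system can store a salted hash of a password instead of the password itself, so the secret word is never kept in readable form.

    import hashlib
    import secrets

    def hash_password(password: str):
        # Derive a salted hash so the plain-text password is never stored.
        salt = secrets.token_bytes(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return salt, digest

    def check_password(password: str, salt: bytes, digest: bytes) -> bool:
        # Recompute the hash and compare it in constant time.
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return secrets.compare_digest(candidate, digest)

    salt, digest = hash_password("correct horse battery staple")
    print(check_password("correct horse battery staple", salt, digest))  # True
    print(check_password("wrong guess", salt, digest))                   # False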

Computer security is frequently associated with three core areas, conveniently summarized by the acronym "CIA": confidentiality, which is ensuring that information is not accessed by unauthorized persons; integrity, which is ensuring that information is not altered by unauthorized persons in a way that is undetectable by authorized users; and authentication, which is ensuring that users are the persons they claim to be. Computer security is not restricted to these three broad concepts. Additional ideas that are often considered part of the taxonomy of computer security include access control, non-repudiation, availability, and privacy.
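
To show what the integrity idea can look like in practice, here is a minimal sketch (my own example, with a made-up secret key and messages) that attaches an HMAC tag to a message, so any alteration by someone who lacks the key becomes detectable.

    import hmac
    import hashlib

    key = b"shared-secret-key"  # hypothetical key known only to authorized parties
    message = b"transfer 100 QAR to account 42"

    # Sender attaches an authentication tag to the message.
    tag = hmac.new(key, message, hashlib.sha256).digest()

    # Receiver recomputes the tag; any change to the message changes it.
    tampered = b"transfer 900 QAR to account 42"
    print(hmac.compare_digest(tag, hmac.new(key, message, hashlib.sha256).digest()))   # True
    print(hmac.compare_digest(tag, hmac.new(key, tampered, hashlib.sha256).digest()))  # False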

Computer security can also be analyzed by function, and it breaks down into five distinct functional areas:

1. Risk avoidance: a security fundamental that starts with questions like: Does my organization or business engage in activities that are too risky? Do we really need an unrestricted Internet connection? Do we really need to computerize that secure business process?

2. Deterrence: reduces the threat to information assets through fear. It can consist of communication strategies designed to impress on potential attackers the likelihood of getting caught.

3. Prevention: the traditional core of computer security. It consists of implementing safeguards such as standard security tools. Absolute prevention is theoretical, since there is a vanishing point where additional preventative measures are no longer cost-effective.

4. Detection: works best in conjunction with preventative measures. When prevention fails, detection should kick in, preferably while there is still time to prevent damage; it includes log-keeping and auditing activities.

5. Recovery: when all else fails, be prepared to pull out backup media and restore from scratch, cut over to backup servers and network connections, or fall back on a disaster recovery facility. Arguably, this function should be attended to before the others.

Analyzing security by function can be a valuable part of the security planning process; a strong security policy will address all five areas, starting with recovery. This overview, however, is primarily concerned with prevention and detection.
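
As one simplified example of the detection function, the sketch below (my own illustration, with a made-up audit log) scans a list of login attempts and flags any user with repeated failures; real intrusion detection is far more involved, but the log-keeping and auditing idea is the same.

    from collections import Counter

    # Hypothetical audit log: (username, succeeded) pairs.
    log = [
        ("alice", True), ("bob", False), ("bob", False),
        ("bob", False), ("carol", True), ("bob", False),
    ]

    # Count failed attempts per user and alert past a threshold.
    failures = Counter(user for user, ok in log if not ok)
    THRESHOLD = 3
    for user, count in failures.items():
        if count >= THRESHOLD:
            print(f"ALERT: {count} failed logins for {user}")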

Computer science link 1 Computer science link 2 Computer science link 3

Question: How can we improve our own computer's security?

Question: What's the best recovery method?

Question: Who invented antivirus software, and why?

Cloud Computing

Cloud computing is the delivery of computing as a service rather than a product, whereby shared resources, software, and information are provided to computers and other devices as a utility over a network. Clouds can be classified as public, private, or hybrid.

What cloud computing is not about is your hard drive. When you store data on, or run programs from, the hard drive, that's called local storage and computing. Everything you need is physically close to you, which means accessing your data is fast and easy (for that one computer, or others on the local network). Working off your hard drive is how the computer industry functioned for decades, and some argue it's still superior to cloud computing, for reasons I'll explain shortly. The cloud is also not about having a dedicated hardware server in residence: storing data on a home or office network does not count as utilizing the cloud. For it to be considered "cloud computing," you need to access your data or your programs over the Internet, or at the very least have that data synchronized with other information over the Net. In a big business, you may know all there is to know about what's on the other side of the connection; as an individual user, you may never have any idea what kind of massive data processing is happening on the other end. The end result is the same: with an online connection, cloud computing can be done anywhere, anytime.

InfoWorld talked to dozens of vendors, analysts, and IT customers to tease out the various components of cloud computing. Based on those discussions, here's a rough breakdown of what cloud computing is all about:

First is SaaS (software as a service). This type of cloud computing delivers a single application through the browser to thousands of customers using a multitenant architecture. Second is utility computing. The idea is not new, but this form of cloud computing is getting new life from Amazon.com, Sun, IBM, and others who now offer storage and virtual servers that IT can access on demand. Third is web services in the cloud. Closely related to SaaS, web service providers offer APIs that enable developers to exploit functionality over the Internet, rather than delivering full-blown applications (a small sketch of calling such an API appears after this breakdown). They range from providers offering discrete business services to the full range of APIs offered by Google Maps, ADP payroll processing, the U.S. Postal Service, Bloomberg, and even conventional credit card processing services.

Fourth is"Platform as service" Another SaaS variation, this form of cloud computing delivers development environments as a service. You build your own applications that run on the provider's infrastructure and are delivered to your users via the Internet from the provider's servers.

As for utility computing, early enterprise adopters mainly use it for supplemental, non-mission-critical needs, but one day it may replace parts of the datacenter. Other providers offer solutions that help IT create virtual datacenters from commodity servers, such as 3Tera's AppLogic and Cohesive Flexible Technologies' Elastic Server on Demand. Liquid Computing's LiquidQ offers similar capabilities, enabling IT to stitch together memory, I/O, storage, and computational capacity as a virtualized resource pool available over the network.
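
To illustrate the "APIs over the Internet" idea mentioned above, here is a hedged sketch that fetches JSON from a web service using only the Python standard library; the endpoint URL is made up for illustration, but real providers (maps, payroll, payments) are called the same way.

    import json
    import urllib.request

    # Hypothetical endpoint; replace with a real provider's API URL.
    url = "https://api.example.com/v1/geocode?address=Doha"

    with urllib.request.urlopen(url) as response:
        data = json.load(response)  # parse the JSON body the service returns

    print(data)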

Cloud link 1 Cloud link 2 Cloud link 3

Question 1: Who started cloud computing?

Question 2: What was the first cloud system?

Question 3: How are clouds connected across continents?

Theory of Computation

Computations are designed for processing information. They can be as simple as estimating the driving time between cities, or as complex as a weather prediction. The study of computation aims at providing insight into the characteristics of computations. Such insight can be used for predicting the complexity of desired computations, for choosing the approaches they should take, and for developing tools that facilitate their design. The study of computation reveals that there are problems that cannot be solved, and that, of the problems that can be solved, some require an infeasible amount of resources (e.g., millions of years of computation time). These revelations might seem discouraging, but they have the benefit of warning against trying to solve such problems. Approaches for identifying such problems are also provided by the study of computation. On an encouraging note, the study of computation provides tools for identifying problems that can feasibly be solved, as well as tools for designing such solutions. In addition, the study develops precise and well-defined terminology for communicating intuitive thoughts about computations.

In theoretical computer science and mathematics, the theory of computation is the branch that deals with how efficiently problems can be solved on a model of computation, using an algorithm. The field is divided into three major branches: automata theory, computability theory, and computational complexity theory. The theory of computation can be considered the creation of models of all kinds in the field of computer science; therefore, mathematics and logic are used. In the last century it became an independent academic discipline and was separated from mathematics.

Computability theory deals primarily with the question of the extent to which a problem is solvable on a computer. The statement that the halting problem cannot be solved by a Turing machine is one of the most important results in computability theory, as it is an example of a concrete problem that is both easy to formulate and impossible to solve using a Turing machine. Much of computability theory builds on the halting problem result.
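
The standard argument for why the halting problem is unsolvable can be sketched in Python. This is only an illustration of the proof idea (the function names are my own), not working decision logic: suppose a correct halts(program, data) existed; then the self-referential program below would contradict it.

    def halts(program, data):
        # Hypothetical decider: would return True iff program(data) eventually halts.
        # The halting problem result says no correct, always-terminating version can exist.
        raise NotImplementedError("no such decider can exist")

    def paradox(program):
        # If the decider claims program(program) halts, loop forever; otherwise halt.
        if halts(program, program):
            while True:
                pass
        return "halted"

    # Feeding paradox to itself is the contradiction:
    # if halts(paradox, paradox) were True, paradox(paradox) would loop forever;
    # if it were False, paradox(paradox) would halt. Either answer is wrong,
    # so no halts function can be correct for every program.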

Complexity theory considers not only whether a problem can be solved at all on a computer, but also how efficiently it can be solved. Two major aspects are considered: time complexity and space complexity, which are, respectively, how many steps it takes to perform a computation and how much memory is required to perform that computation. To analyze how much time and space a given algorithm requires, computer scientists express the time or space required to solve the problem as a function of the size of the input problem. For example, finding a particular number in a long list of numbers becomes harder as the list grows larger. If we say there are n numbers in the list, then if the list is not sorted or indexed in any way we may have to look at every number in order to find the one we're seeking. We thus say that in order to solve this problem, the computer needs to perform a number of steps that grows linearly with the size of the problem. To simplify such statements, computer scientists have adopted Big O notation, which allows functions to be compared in a way that ensures particular aspects of a machine's construction do not need to be considered, only the asymptotic behavior as problems become large. So in our previous example we might say that the problem requires O(n) steps to solve. Perhaps the most important open problem in all of computer science is the question of whether a certain broad class of problems, denoted NP, can be solved efficiently; this is discussed further under the complexity classes P and NP.
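
The linear-search example above is easy to make concrete. The sketch below (my own example data) looks at each of the n numbers in turn, so in the worst case the number of steps grows linearly with the length of the list, i.e. O(n).

    def linear_search(numbers, target):
        # Worst case: every one of the n elements is examined, hence O(n) steps.
        for index, value in enumerate(numbers):
            if value == target:
                return index
        return -1  # not found

    data = [7, 3, 9, 14, 2, 8]
    print(linear_search(data, 14))  # 3
    print(linear_search(data, 5))   # -1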

Theory link 1 Theory link 2 Theory link 3

Question 1: What is the halting problem?

Question 2: Is theory more about math or computation?

Question 3: How did theory start?