The Theory of Computation/Computing
The theory of computation is a broad and rapidly growing subfield of computer science that studies how problems can be solved using algorithms on a model of computation. The most commonly used model is the Turing machine. The field is usually divided into three branches: Automata Theory (finite automata), Computability Theory (computability and logic), and Computational Complexity Theory.
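To make the automata branch concrete, here is a minimal sketch of a deterministic finite automaton (DFA), one of the simplest models of computation. This toy machine, my own illustrative example rather than one from a textbook, accepts binary strings containing an even number of 1s.

```python
def accepts_even_ones(s: str) -> bool:
    """Run a two-state DFA over s and report whether it ends in an accepting state."""
    # States: "even" (the accepting start state) and "odd".
    transitions = {
        ("even", "0"): "even", ("even", "1"): "odd",
        ("odd", "0"): "odd",   ("odd", "1"): "even",
    }
    state = "even"
    for symbol in s:
        state = transitions[(state, symbol)]
    return state == "even"

print(accepts_even_ones("1011"))  # → False (three 1s)
print(accepts_even_ones("1001"))  # → True (two 1s)
```

The transition table is the whole machine: no memory beyond the current state, which is exactly what distinguishes finite automata from the more powerful Turing machine.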
Applications of the theory of computation/computing that I found interesting: secure cryptography, search engines, computational biology, quantum computing, pseudo-randomness, big data, and machine learning.
Cryptography is the study and application of methods for maintaining secure communication so that no adversaries or harmful third parties can access users' private information. Cryptographers provide this safety by designing schemes whose security rests on problems believed to be too hard for any adversary to solve in practice.
Limitations: Strong encryption algorithms are restricted in some countries, as their use can raise suspicions of spying and sabotage. Also, organizations that hold users' encrypted data are at constant risk of attack by adversaries, and even a single successful breach can be very valuable to the attackers.
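A hedged toy illustration of the basic idea behind encryption is the one-time pad, which XORs the message with a random key of equal length. With a truly random, never-reused key this scheme is information-theoretically secure; real systems use practical ciphers such as AES instead, so treat this only as a sketch of the principle.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each message byte with the corresponding key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # fresh random key, same length as the message

ciphertext = xor_bytes(message, key)   # encrypt
recovered = xor_bytes(ciphertext, key) # decrypt with the same key
print(recovered == message)  # → True
```

Because XOR is its own inverse, applying the same key twice recovers the plaintext; without the key, the ciphertext alone reveals nothing about the message.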
Unlike PCs, workstations, and Turing machines, which work on only two bit values (0 and 1), quantum computers use special quantum bits called qubits, which can be realized with atoms, photons, electrons, or ions. Qubits can exist in superposition, so while a classical computer carries out one computation at a time, a quantum computer can, in a sense, explore a vast number of computations simultaneously.
Limitations: Since quantum computers draw their power from the states of subatomic particles, any interaction with the environment risks disturbing those states and changing their values, a phenomenon known as decoherence.
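Superposition can be made concrete with a small classical simulation of a single qubit. A qubit state is a pair of complex amplitudes (a, b) with |a|² + |b|² = 1; measuring it yields 0 with probability |a|² and 1 with probability |b|². The equal superposition below is a standard textbook example, not tied to any particular hardware.

```python
import random

def measure(a: complex, b: complex) -> int:
    """Collapse the qubit: return 0 or 1 with the Born-rule probabilities."""
    p0 = abs(a) ** 2
    return 0 if random.random() < p0 else 1

# Equal superposition: both amplitudes 1/sqrt(2), so each outcome has probability 0.5.
a = b = 2 ** -0.5
samples = [measure(a, b) for _ in range(10_000)]
print(sum(samples) / len(samples))  # ≈ 0.5 on average
```

Each measurement gives a definite 0 or 1, yet until measured the qubit genuinely holds both amplitudes at once, which is the resource quantum algorithms exploit.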
Computational biology is the study of applying theoretical methods, simulation techniques, mathematical modelling, and related tools to develop and speed up advances in the field of biology. With its advent, biologists can store and process large amounts of data, making the analysis of such data far more accurate. The Human Genome Project is a remarkable product of this field.
Having once used the SPSS statistics software in a research project with a large sample size, I know how efficient such tools are. I found the various ways the software can process the provided input very helpful; researchers can use these tools to analyse their data and present the results in the form they need.
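A small sketch of the kind of sequence analysis computational biology automates is computing the GC content of a DNA string, the fraction of bases that are G or C. The sequence below is made up purely for illustration.

```python
from collections import Counter

def gc_content(dna: str) -> float:
    """Return the fraction of G and C bases in a DNA sequence."""
    counts = Counter(dna.upper())
    return (counts["G"] + counts["C"]) / len(dna)

seq = "ATGCGCATTAGC"  # a made-up 12-base sequence
print(f"GC content: {gc_content(seq):.2f}")  # → GC content: 0.50
```

Real pipelines run computations like this over genomes with billions of bases, which is exactly why the efficient storage and processing mentioned above matters.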
#How random is a “random number” according to theoretical computer scientists?
#How does the Turing machine compute the values that deviate from a pattern?
#How is theory of computation related to computational biology?
#Cryptography - a way to safety, or a loss of privacy?
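The first question above can be made concrete with a linear congruential generator (LCG), a classic pseudo-random number generator. Its output looks random yet is fully determined by the seed, which is exactly why theoretical computer scientists distinguish pseudo-randomness from true randomness. The constants below are the well-known Numerical Recipes parameters.

```python
def lcg(seed: int, n: int, a: int = 1664525, c: int = 1013904223, m: int = 2**32):
    """Yield n pseudo-random integers in [0, m) from the recurrence x -> (a*x + c) mod m."""
    x = seed
    for _ in range(n):
        x = (a * x + c) % m
        yield x

# The same seed always reproduces the same "random" sequence.
print(list(lcg(42, 3)) == list(lcg(42, 3)))  # → True
```

To a statistical test the stream may look random, but anyone who knows the seed can predict every value, so such generators are "random" only relative to observers with limited computational power.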
Useful links below:
Theory of Computation at MIT
How Quantum Computers work
Introduction to Computational Biology-Cornell University