Harry Wen @ CMUQ

Prof. Riley/Security

According to BankInfoSecurity, in 2016 a group of hackers broke into Qatar National Bank's database and leaked 1.4 GB of sensitive data containing internal banking files and customer financial information, yet how the hackers obtained the data remains unclear (it could also have been an inside job). Nearly one million card numbers and passwords, along with cardholder information, were leaked, affecting the bank's clients. Fortunately, the bank also required verification through a registered mobile phone, which limited the damage. The bank conducted further investigations but did not release details.

As reported by Reuters, in 2017 attackers hacked the Qatar News Agency's website and social media accounts and planted fake quotes attributed to Qatar's Emir to drive a wedge between Qatar and its neighbors. The investigation team claimed the attackers used advanced hacking techniques and exploited vulnerabilities in the agency's systems, but did not disclose the exact method. Qatar's diplomatic relations with the surrounding Arab states were severely damaged, as Saudi Arabia, Egypt, and the United Arab Emirates, among others, subsequently cut ties with Qatar. The investigation by Qatar's Foreign Ministry, assisted by American and British cybersecurity agencies, confirmed the attack but did not confirm who carried it out. Rumors attributed it to Russian hackers.

KA IBM CA

Prof. Giselle/PL

Programmers initially coded with punch cards, which were inefficient and had zero tolerance for error. Since a program was stored on a large stack of cards, it was cumbersome to carry around and extremely difficult to debug. However, breakthroughs in transistor technology helped bring modern programming languages to life, so programmers could finally work through a better and more accessible interface.

Countless programming languages exist because languages constantly evolve to suit different purposes. For example, JavaScript is geared toward web development, Python toward complex calculations, C toward managing low-level details of computer systems, and Java toward large cross-platform applications, including some games. There are also many specialized tools for game development, such as the Unity engine and various 3D rendering engines, each scripted in its own languages.

Python, one of the programming languages I know, is notorious for its slowness and inefficiency, especially compared to low-level languages like C, but the trade-off is simpler development and greater user friendliness. Still, a language that achieved both would be ideal.
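To make that trade-off concrete, here is a minimal timing sketch: a hand-written Python loop pays interpreter overhead on every iteration, while the built-in sum() runs in compiled C inside CPython, so the same computation can differ in speed by an order of magnitude (exact timings depend on the machine).

```python
import time

def timed(fn, *args):
    """Time a single call and return (seconds, result)."""
    start = time.perf_counter()
    result = fn(*args)
    return time.perf_counter() - start, result

def python_sum(n):
    # Interpreted loop: each iteration pays interpreter overhead.
    total = 0
    for i in range(n):
        total += i
    return total

n = 10_000_000
t_loop, _ = timed(python_sum, n)
t_builtin, _ = timed(sum, range(n))  # sum() is implemented in C
print(f"pure-Python loop: {t_loop:.2f}s, built-in sum: {t_builtin:.2f}s")
```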

To create a new programming language, I would have to think about how to simplify its way of abstracting things while maintaining good efficiency and memory management. I would not need to rebuild the hardware, which is the domain of electrical engineering; instead, I would define the syntax and the rules that govern how programs behave, then implement a compiler or interpreter so the language runs on existing computers.
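To show what "defining the syntax and rules" can mean in miniature, here is a toy language sketch: the grammar (parenthesized prefix expressions) and the two operators are hypothetical choices made purely for illustration, but the tokenizer/parser/evaluator pipeline mirrors how real language implementations are structured.

```python
# Toy prefix-notation language: "(add 1 (mul 2 3))" evaluates to 7.

def tokenize(src):
    return src.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    tok = tokens.pop(0)
    if tok == "(":
        expr = []
        while tokens[0] != ")":
            expr.append(parse(tokens))
        tokens.pop(0)  # discard the closing ")"
        return expr
    return int(tok) if tok.lstrip("-").isdigit() else tok

# The language's "rules": what each operator means.
OPS = {"add": lambda a, b: a + b, "mul": lambda a, b: a * b}

def evaluate(node):
    if isinstance(node, int):
        return node
    op, left, right = node
    return OPS[op](evaluate(left), evaluate(right))

print(evaluate(parse(tokenize("(add 1 (mul 2 3))"))))  # -> 7
```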

Python Java C

Prof. Hammoud/Cloud

Cloud computing is the practice of using online computing services that the user does not physically manage; instead, users pay for the cloud services they consume as needed. Because it is delivered over the internet, cloud computing is relatively new and differs from traditional hosting: it is on-demand and cost-efficient, since you are essentially renting the resources remotely rather than maintaining them yourself. The three major cloud service models are Infrastructure as a Service (IaaS), which provides computing power and storage; Platform as a Service (PaaS), which provides managed environments for building and deploying applications; and Software as a Service (SaaS), which delivers software applications through a browser.
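As a minimal sketch of the IaaS model, the snippet below lists rented compute and storage resources through AWS's boto3 SDK. It assumes boto3 is installed and AWS credentials are already configured locally; the region and printed fields are illustrative choices, not requirements.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
s3 = boto3.client("s3")

# Compute (IaaS): every EC2 virtual machine currently rented.
for reservation in ec2.describe_instances()["Reservations"]:
    for instance in reservation["Instances"]:
        print(instance["InstanceId"], instance["State"]["Name"])

# Storage (IaaS): every S3 bucket in the account.
for bucket in s3.list_buckets()["Buckets"]:
    print(bucket["Name"])
```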

In the real world, Netflix runs most of its computing and storage on AWS to entertain viewers around the world, the US Army has adopted Google Cloud to support its operations, and many healthcare providers rely on Microsoft Azure to manage clinical and medical data. Cloud providers like these make money from customers' resource utilization, and users typically pay only for the resources they consume, which again makes the model flexible and on-demand.
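A toy pay-as-you-go calculation makes the billing model concrete; the rates below are entirely hypothetical, not any provider's real pricing.

```python
RATE_PER_VM_HOUR = 0.05    # $ per VM-hour (hypothetical rate)
RATE_PER_GB_MONTH = 0.02   # $ per GB-month of storage (hypothetical rate)

vm_hours = 3 * 24 * 30     # three VMs running for a 30-day month
storage_gb = 500           # average data stored over the month

bill = vm_hours * RATE_PER_VM_HOUR + storage_gb * RATE_PER_GB_MONTH
print(f"monthly bill: ${bill:.2f}")  # 2160*0.05 + 500*0.02 = $118.00
```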

AWS Azure Google

Prof. Christos/Theory

A decision problem in computer science is simply one whose answer is only “yes” or “no.” Such a problem is decidable if and only if there exists an algorithm that halts on every input and always gives the correct answer; note that decidability only requires a finite amount of time, not a fast one. Class P problems are those that are relatively easy to solve (solvable in polynomial rather than exponential time), whereas class NP problems are those whose solutions are easy to verify (also in polynomial time). The classic question P vs NP asks whether P = NP; that is, is every problem that is quickly verifiable also quickly solvable? No one has yet produced an accepted proof either way, and whoever settles it earns a $1 million prize from the Clay Mathematics Institute.
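Subset sum illustrates the verify/solve gap. A claimed solution (the "certificate") can be checked in polynomial time, but the only obvious way to find one is to try all 2^n subsets, as in this small sketch.

```python
from itertools import chain, combinations

def verify(numbers, target, certificate):
    """Polynomial-time check of a claimed solution."""
    return all(x in numbers for x in certificate) and sum(certificate) == target

def solve(numbers, target):
    """Exponential-time brute force over all 2^n subsets."""
    subsets = chain.from_iterable(
        combinations(numbers, r) for r in range(len(numbers) + 1)
    )
    return next((s for s in subsets if sum(s) == target), None)

nums = [3, 34, 4, 12, 5, 2]
print(solve(nums, 9))            # finds e.g. (4, 5), slowly
print(verify(nums, 9, (4, 5)))   # True, checked quickly
```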

MIT GfG Lark

Prof. Eduardo/SW Engineering

The software crisis of the 1960s refers to the period when, as software projects scaled up and grew increasingly complex, the traditional methods and techniques used for small software systems could no longer meet users' demands, leading to unacceptably poor quality and budget overruns. From there, software engineering emerged as a systematic approach to managing the development of larger projects and keeping them cost-efficient.

In software engineering, Agile methods are those that are flexible and iterative: teams establish a general structure, then fill in and adjust features as needed. The waterfall model, by contrast, follows a strict plan carried out step by step in order, leaving limited room for change.

Netflix regularly practices chaos engineering, much like a fire drill, to keep its systems robust. Engineers deliberately cause controlled, random disruptions and failures in the company's infrastructure to test and strengthen its resilience and error-handling capabilities when confronted with unexpected problems.
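Here is a toy version of that fault-injection idea (in the spirit of Chaos Monkey, not Netflix's real implementation): a decorator randomly fails a fraction of calls so the caller's error handling actually gets exercised. The service name and failure rate are made up for illustration.

```python
import random

def chaotic(failure_rate=0.2):
    """Wrap a function so a random fraction of calls raise an error."""
    def wrap(fn):
        def inner(*args, **kwargs):
            if random.random() < failure_rate:
                raise ConnectionError(f"injected failure in {fn.__name__}")
            return fn(*args, **kwargs)
        return inner
    return wrap

@chaotic(failure_rate=0.3)
def fetch_recommendations(user_id):
    return ["movie-a", "movie-b"]  # stand-in for a real service call

# A resilient caller retries and falls back instead of crashing.
for attempt in range(3):
    try:
        print(fetch_recommendations(42))
        break
    except ConnectionError as err:
        print(f"attempt {attempt + 1} failed: {err}")
else:
    print("serving cached fallback list")
```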

Open-source software is software that anyone is free to use, inspect, modify, and redistribute. Big companies like Google, Microsoft, and Amazon contribute to it through their cloud services because the open collaboration it enables ultimately benefits everyone: companies save costs, more resources become accessible to the community, and software can be built rapidly on shared knowledge.

Technical debt refers to the phenomenon in which a team rushes its code in order to release early, knowing the code is potentially buggy and will have to be fixed later. Technical debt accumulates, however, and can eventually break the software, making it harder and harder to debug and costing more than it would have to write the code well from the very beginning.
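A hypothetical sketch of how that debt accrues: a rushed special-case function (the "loan") versus the general fix that repays it.

```python
# Rushed release: hard-coded branches pile up with every new country.
def shipping_cost_rushed(country):
    if country == "QA":
        return 0
    if country == "US":
        return 10
    return 25  # silently wrong for any country added later

# Repaying the debt: one data table, one lookup, easy to extend and test.
SHIPPING_RATES = {"QA": 0, "US": 10}
DEFAULT_RATE = 25

def shipping_cost(country):
    return SHIPPING_RATES.get(country, DEFAULT_RATE)

print(shipping_cost("QA"), shipping_cost("DE"))  # 0 25
```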

UCAS MTU GfG

Prof. Gianni/Rob&AI

Artificial intelligence, in my opinion, is essentially a man-made mind with full consciousness and cognition, capable of performing all human tasks (perhaps even better than humans themselves). AI has been around for a long time but remained distant from everyday life until recently, largely thanks to the rise in computing power that Nvidia GPUs now enable; still, I believe what exists today is only preliminary, weak AI.

Current challenges for AI include insufficient training data, biases and inaccuracies in that data, and inefficient algorithms, or, more practically, the need for even greater computational power. Robots are not yet smart enough for general use, so they are deployed almost exclusively for extreme tasks like space exploration, nuclear waste disposal, and warfare. Lastly, I believe true AGI and androids are still far from reality, but if achieved they could, of course, replace basically everything we have now.

IBM Google TchTrgt