Computational Thinking and Problem Solving

Chapter 50 – Thinking logically, thinking concurrently

Structured approach

3 basic programming structures: sequence, selection and iteration

The structured programming approach aims to improve the clarity and maintainability of programs; only these three control structures may be used in block-structured languages.
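
As a rough sketch (Python is used here purely for illustration, and the function and data are made up), the three structures look like this:

def count_passes(scores):
    passes = 0                        # sequence: statements run one after another
    for score in scores:              # iteration: repeat a block for each item
        if score >= 50:               # selection: choose a path based on a condition
            passes = passes + 1
    return passes

print(count_passes([35, 62, 78, 49, 90]))    # prints 3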

Tools for designing algorithms

Thinking concurrently

parallel computing: multiple processors each executing different instructions simultaneously

benefits

  1. Speedup
  2. Performance Scaling
  3. High Throughput
  4. Real-time Processing

trade-offs

  1. Synchronization Overhead
  2. Communication Overhead
  3. Limited Applicability
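
A minimal sketch of parallel computing as defined above, assuming a multi-core machine; Python's standard multiprocessing module is used, and the divisor-counting task and input range are illustrative only:

from multiprocessing import Pool

def count_divisors(n):
    # CPU-bound work: count the divisors of n by trial division
    return sum(1 for d in range(1, n + 1) if n % d == 0)

if __name__ == "__main__":
    numbers = range(100000, 100040)
    with Pool() as pool:                             # one worker process per CPU core
        results = pool.map(count_divisors, numbers)  # chunks of the work run simultaneously
    print(sum(results))

Each worker runs on its own core, so the elapsed time is roughly the single-core time divided by the number of cores, less the synchronisation and communication overheads listed above.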

concurrent processing: several processes are running, with each in turn being given a slice of processor time.

benefits

  1. Optimizes resource usage.
  2. Efficiently shares resources among tasks.
  3. Isolates faults, improving reliability.
  4. Facilitates scalability with distributed tasks.

trade-offs

  1. Complexity
  2. Increased Memory Usage
  3. Difficulty in Testing
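
A minimal sketch of concurrent processing: several tasks share one processor, each being given a slice of time in turn (the task names and step counts are made up):

import threading
import time

def task(name, steps):
    for i in range(steps):
        print(name, "step", i)
        time.sleep(0.01)              # give up the processor so another task can run

threads = [threading.Thread(target=task, args=("task-" + str(n), 3)) for n in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()                          # the printed steps from the three tasks interleave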

Chapter 51 – Problem recognition

Computable problems

What can be computed

Methods of problem solving

Strategies for problem solving

Divide and conquer reduces the size of the problem with every iteration; binary search is a classic example (see the sketch below).
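
A minimal sketch of binary search (the sorted list and target value are made up): each pass discards half of the remaining items.

def binary_search(items, target):
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid                 # found: return its index
        elif items[mid] < target:
            low = mid + 1              # discard the lower half
        else:
            high = mid - 1             # discard the upper half
    return -1                          # target is not in the list

print(binary_search([2, 5, 8, 12, 16, 23, 38], 16))    # prints 4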

Automation involves building models and putting them into action to solve problems.

Chapter 52 – Problem solving

Visualisation

Computers work with binary numbers, but humans often prefer a visual image. For example, a binary tree can be represented as a diagram rather than as raw data.
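
A minimal sketch of the idea (the Node class and the example tree are made up): the same tree held as data can be printed as an indented, sideways diagram that is much easier for a human to read.

class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def draw(node, depth=0):
    if node is None:
        return
    draw(node.right, depth + 1)            # right subtree printed above the root
    print("    " * depth + str(node.value))
    draw(node.left, depth + 1)             # left subtree printed below the root

tree = Node(17, Node(8, Node(4), Node(12)), Node(25, Node(21), Node(30)))
draw(tree)    # root appears at the left margin, leaves furthest to the right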

Backtracking

In some problems, in order to find a solution you have to make a series of decisions, but there may be cases for which:

  1. you don't have enough information to know which choice is best
  2. each decision leads to a new set of choices
  3. one or more of the sequences may be a solution to the problem

Backtracking is a methodical way of trying out different sequences until you find one that leads to a solution. It is the technique used in a depth-first traversal of a graph.
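
A minimal sketch of backtracking using the classic n-queens puzzle (the choice of puzzle is mine, not the source's): queens are placed one column at a time, and whenever a choice leads to a dead end the algorithm undoes it and tries the next row, exactly as in a depth-first search.

def solve_queens(n, placed=()):
    if len(placed) == n:
        return placed                          # every column filled: a solution
    col = len(placed)
    for row in range(n):
        safe = all(row != r and abs(row - r) != col - c
                   for c, r in enumerate(placed))
        if safe:
            solution = solve_queens(n, placed + (row,))
            if solution:                       # this choice led to a solution
                return solution
            # otherwise fall through and try the next row (backtrack)
    return None                                # no row works in this column

print(solve_queens(6))    # one valid placement, e.g. (1, 3, 5, 0, 2, 4)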

Data mining

Data mining is the process of digging through big data sets to discover hidden connections and predict future trends, typically using software packages such as analytics tools. Big data is the term used for data sets so large that they cannot easily be handled in a traditional database.

Intractable problems

Intractable problems are ones for which, although an algorithm may exist, it would take an unreasonably long time to find the solution.

Comparing time complexities

Algorithms with different orders of time complexity differ enormously in running time as n grows, as the sketch below shows.
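
A minimal sketch that prints the comparison (the chosen orders and values of n are illustrative):

import math

orders = {
    "log n":   lambda n: math.log2(n),
    "n":       lambda n: n,
    "n log n": lambda n: n * math.log2(n),
    "n^2":     lambda n: n ** 2,
    "2^n":     lambda n: 2 ** n,
}

for n in (10, 20, 40):
    row = ", ".join("{}={:,.0f}".format(name, f(n)) for name, f in orders.items())
    print("n={}:  {}".format(n, row))

By n = 40 the O(2^n) count is already around a trillion steps, which is why exponential-time algorithms are considered intractable in practice.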

Heuristic methods

An approach to problem solving which employs an algorithm or methodology that is not guaranteed to be optimal or perfect, but which is sufficient for the purpose.
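
A minimal sketch of a heuristic, using the nearest-neighbour rule for a travelling-salesman-style tour (the rule and the city coordinates are my own illustration): it rarely gives the best possible tour, but it gives a reasonable one very quickly.

import math

cities = {"A": (0, 0), "B": (1, 5), "C": (5, 2), "D": (6, 6), "E": (8, 3)}

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def nearest_neighbour_tour(start="A"):
    unvisited = set(cities) - {start}
    tour, current = [start], start
    while unvisited:
        here = cities[current]
        # heuristic choice: always move to the closest unvisited city
        current = min(unvisited, key=lambda c: dist(here, cities[c]))
        unvisited.remove(current)
        tour.append(current)
    return tour

print(nearest_neighbour_tour())    # a reasonable tour, not guaranteed to be the shortest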

Performance modelling

Performance modelling is the process of simulating different user and system loads on a computer using a mathematical approximation, rather than doing actual performance testing, which may be difficult and expensive.
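
A minimal sketch, assuming a simple M/M/1 queueing model is an acceptable mathematical approximation of the system (the arrival and service rates are made up):

def mm1_response_time(arrival_rate, service_rate):
    # average response time, in seconds, for an M/M/1 queue
    if arrival_rate >= service_rate:
        return float("inf")             # the server cannot keep up with the load
    return 1.0 / (service_rate - arrival_rate)

service_rate = 100.0                     # requests the server can handle per second
for arrival_rate in (20, 50, 80, 95):    # modelled user loads, in requests per second
    t = mm1_response_time(arrival_rate, service_rate)
    print(arrival_rate, "req/s ->", round(t * 1000, 1), "ms average response")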

Pipelining

Pipelining is the technique of splitting tasks into smaller parts and overlapping the processing of each part of the task.
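
A minimal sketch of the effect, assuming one clock cycle per stage and no stalls (the stage and task counts are illustrative):

def cycles_without_pipeline(n_tasks, n_stages):
    return n_tasks * n_stages            # each task runs every stage on its own

def cycles_with_pipeline(n_tasks, n_stages):
    return n_stages + n_tasks - 1        # once the pipe is full, one task finishes per cycle

n_tasks, n_stages = 100, 4               # e.g. fetch, decode, execute, write back
print(cycles_without_pipeline(n_tasks, n_stages))    # 400 cycles
print(cycles_with_pipeline(n_tasks, n_stages))       # 103 cycles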