Math for computer architecture/engineering

Hi, I hope you are having a good day.

I am currently trying to expand my very limited math knowledge,
particularly in the topics relevant to my interests.
I was wondering what math is required to be able to do the following:

1. Design a digital calculator without using CAD tools
2. Design PDAs
3. Design CPUs
4. Understand the material in computer engineering books.

I am not looking to become a mathematician; I only need the math that is directly relevant to each of the above topics.

Thanks. Any help will be greatly appreciated.

Start plugging transistors together, make a logic gate.
Learn some Boolean arithmetic, make a calculator.
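
If you want a feel for that step before touching hardware, here's a minimal C sketch (just my own illustration, not from any textbook) of how every basic gate can be built from NAND alone, which is roughly what "plugging transistors together" buys you at the logic level:

```c
/* Software sketch of the "transistors -> gates" step: every gate below
 * is built from NAND alone, which is why NAND is called a universal gate.
 * Illustrative C only, not a hardware description. */
#include <stdio.h>

static int nand(int a, int b) { return !(a && b); }

static int not_(int a)        { return nand(a, a); }
static int and_(int a, int b) { return not_(nand(a, b)); }
static int or_(int a, int b)  { return nand(not_(a), not_(b)); }
static int xor_(int a, int b) { return and_(or_(a, b), nand(a, b)); }

int main(void) {
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            printf("a=%d b=%d  AND=%d OR=%d XOR=%d\n",
                   a, b, and_(a, b), or_(a, b), xor_(a, b));
    return 0;
}
```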

Seriously, doing this stuff without CAD is a nightmare. Ever since computers were first invented, they have been used to … make better computers!

We had a project for an ELEC course where we had to build a computer on top of an FPGA. Even with CAD, building a system that could read user input in binary from a bank of switches and output hexadecimal on some 7-segment displays was a term-long project. And that was with the ability to code in VHDL (which is not as high level as they’d like to believe) to simplify various operations. To do it without the benefit of a computer doing the fan-in/fan-out calculations and the actual transistor routing, it would be a multi-year project.
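
To give a sense of even the "easy" end of that project, here's a rough sketch of the switches-to-7-segment decode step, written in C rather than the VHDL we actually used. The segment packing (bit 0 = segment a, common-cathode, active-high) is an assumption; in hardware this is just combinational logic, not a table lookup:

```c
/* Rough sketch of the binary-switches -> 7-segment decode step.
 * Assumes segments packed as 0b0gfedcba on a common-cathode display;
 * the patterns are the usual hex font 0-F. */
#include <stdio.h>

static const unsigned char SEG[16] = {
    0x3F, 0x06, 0x5B, 0x4F, 0x66, 0x6D, 0x7D, 0x07,   /* 0-7 */
    0x7F, 0x6F, 0x77, 0x7C, 0x39, 0x5E, 0x79, 0x71    /* 8-F */
};

int main(void) {
    unsigned switches = 0xB;                 /* pretend 4 input switches read 1011 */
    unsigned char pattern = SEG[switches & 0xF];
    printf("input %X -> segment pattern 0x%02X\n", switches, (unsigned)pattern);
    return 0;
}
```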

In terms of math:

  • Differentiation and Integration (for understanding fan-in/out and why stray capacitance and inductance can ruin everything)
  • Boolean logic and arithmetic (as a challenge, design an adder out of XOR, AND, and OR gates; see the sketch after this list). Lots and lots of this.
  • Thermodynamics, so you can understand why we are hitting limits packing more transistors into smaller spaces
  • Relativity, so you can figure out why 4 GHz is about the limit short of changing to alternate technologies
  • Quantum mechanics, so you can understand why you can’t get below a certain lithography size because electrons are strange
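
To make the adder challenge in the second bullet concrete, here's a small C model of a 1-bit full adder built only from XOR, AND and OR, chained into a 4-bit ripple-carry adder. It's a software sketch, not a hardware description; the function names are just mine:

```c
/* Software model of a 1-bit full adder built from XOR, AND and OR,
 * then chained into a 4-bit ripple-carry adder. */
#include <stdio.h>

static void full_adder(int a, int b, int cin, int *sum, int *cout) {
    int p = a ^ b;                 /* XOR: "propagate" term */
    *sum  = p ^ cin;               /* XOR again gives the sum bit */
    *cout = (a & b) | (p & cin);   /* AND/OR give the carry out */
}

int main(void) {
    int x = 0xB, y = 0x6, carry = 0, result = 0;
    for (int i = 0; i < 4; i++) {              /* ripple the carry up 4 bits */
        int s;
        full_adder((x >> i) & 1, (y >> i) & 1, carry, &s, &carry);
        result |= s << i;
    }
    /* 0xB + 0x6 = 0x11: 4-bit sum 0x1 with carry out 1 */
    printf("0x%X + 0x%X = 0x%X, carry out %d\n", x, y, result, carry);
    return 0;
}
```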

What are you actually trying to achieve? I’d be surprised if designing your own chips from the ground up is the way to do it. Go out and buy someone else’s processor (the ARM Cortex series is pretty standard for small-scale systems), learn about how registers work, and learn some bare-metal C (not the C you use for building software) and possibly assembler. Find out about I2C and SPI. Get frustrated with gdb and stlinkv2. What were you trying to achieve again?
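
For a taste of what "learn how registers work and some bare-metal C" means, here's a toy sketch. The base address, register offsets and pin numbers are invented for illustration; the real values come from whatever chip's reference manual you end up using:

```c
/* Toy bare-metal style sketch: memory-mapped GPIO register access.
 * GPIO_BASE, the offsets and the pin/bit layout are INVENTED for
 * illustration; real values come from your MCU's reference manual. */
#include <stdint.h>

#define GPIO_BASE   0x40020000u                                     /* hypothetical */
#define GPIO_MODER  (*(volatile uint32_t *)(GPIO_BASE + 0x00))      /* mode register */
#define GPIO_ODR    (*(volatile uint32_t *)(GPIO_BASE + 0x14))      /* output register */

static void led_init(void) {
    GPIO_MODER |= (1u << 10);      /* pretend: configure pin 5 as an output */
}

static void led_toggle(void) {
    GPIO_ODR ^= (1u << 5);         /* flip the output bit for pin 5 */
}

int main(void) {
    led_init();
    for (;;) {
        led_toggle();
        for (volatile int i = 0; i < 100000; i++) { /* crude busy-wait delay */ }
    }
}
```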

Side note: current computer systems represent man-millennia of work. Trying to repeat that would not be possible in a single human lifetime, even if you did nothing else.

Another thought: math is simple. You can simplify it all down to Boolean logic, and from there work your way up to addition, multiplication (addition lots of times), integration (multiplication, division and addition lots of times with infinitely small amounts), and on to multivariable calculus. Yeah, simple.
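
As a tiny illustration of the "multiplication is addition lots of times" point, here's the shift-and-add scheme a hardware multiplier boils down to, sketched in C with nothing but addition and shifts:

```c
/* "Multiplication is addition lots of times": shift-and-add,
 * the scheme a simple hardware multiplier implements. */
#include <stdio.h>

static unsigned mul(unsigned a, unsigned b) {
    unsigned result = 0;
    while (b) {
        if (b & 1)        /* if this bit of b is set ...              */
            result += a;  /* ... add the correspondingly shifted a    */
        a <<= 1;
        b >>= 1;
    }
    return result;
}

int main(void) {
    printf("13 * 11 = %u\n", mul(13, 11));   /* prints 143 */
    return 0;
}
```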

Yet another thought: take a university course, at least to second-year math. Take an ELEC course, probably to third-year level. There is a reason engineering is a many-year degree. Once you have done two or three years of engineering at university you will realise a few things:

  1. How complex everything is
  2. That you can do anything with enough time and brainpower
  3. Your brainpower and time aren’t enough as an individual
  4. Everything costs money

Final:

  1. Digital calculator: possible, but hard and a multi-year effort unless you bring in parts (in which case it’s easy). Oh, and do you have a nano-fabrication lab? Theoretically I could do this with a bunch of discrete components, but I never would.
  2. PDAs: You’ll have to bring in other people’s processors and screens for this if you don’t want to set up whole production lines. I hope your brain can hold a lot more transistor routing than mine can. It is possible for people to design and build their own tablets. Theoretically I have the knowledge for this, but I would never bother. It is more in line with my interests, though.
  3. CPUs: Do a PhD, go work for Intel. My three years (so far) of uni education are far below what is necessary for current CPU design.
  4. Anyone can read books; understanding them is the hard part. After three years at uni I can pick up most advanced books (on anything sciency/engineeringy, though not law or economics) and, if I put in enough effort, understand some of it. To actually understand all of any specific topic would require a year or two dedicated to that topic.

Thanks, I really appreciate the help.

My current situation does not afford me the ability to get a formal education, so university is out of the question (disappointment all round).
Even so, I am trying to gain the skills I need by using online material.
I probably won’t build the CPU and PDAs, but I want the theoretical knowledge to be able to do so, at least on paper. Then maybe I could build the calculator and improve on those skills.
For instance, I have had a laptop with a dead motherboard (the computer still shows some lights but never turns on) for about three years, so instead of replacing the motherboard I want to find the problem area and repair it if possible. Stuff like that.

I am a total newbie, so I guess I will start by repairing old game controllers and stuff. Any suggestions on how to start?
I know stuff such as V = IR and some other basics. Any books and sites would be of great help.

The thing is, I understand how to fix computers (by that I mean finding the problem area and replacing it, not repairing it), but I can’t fix a DVD player (I kind of worked my way from the front backwards).

Thanks! Any help will be greatly appreciated.

There was an extensive resource for math literature, but publishers saw the money on the table. :frowning:

Start with Khan Academy, and I think MIT offers free online classes as well.

Thanks everyone, I really appreciate the help.

Repairing computers is quite different from building CPUs from scratch. It is easily achievable with no tertiary education.
These days most computers are repairable only in the ‘modular’ sense: you have to throw bits out (like the RAM if it dies) and then buy replacement parts.