======Matrices details======
Maybe this whole discussion is hinting why HSC Physics is kind of lacking. The teachers cannot even depend on students knowing calculus; maths is not a co-requisite, and yet even extension 2 maths only gives a taste of some of the language required to build modern physics. But it is not just Physics. The mathematical structures I will give an extremely abbreviated overview of are fundamental to analysis of all kinds, not just engineering, and to computation and design.
  
Matrices are everywhere. They can be operators transforming vectors or shapes in cartesian space. GPUs, the 'graphics cards' that run the screens on your devices, are just very parallel matrix calculators: they do matrix operations fast and do many at the same time. Photoshop or whatever graphics applications you use do matrix operations constantly. Solving a set of [[#linear algebra|linear simultaneous equations]], like we do in year 8 in high school, is basically the original motivation for matrices. That problem has been an inspiration for maths going way, way back, one of the earliest mathematics --- but not in Europe; Europe focussed on formal geometry instead and eventually imported algebra and the core numerical algorithms. Matrices came later.

The coefficients form a matrix where each row is an equation, and the steps we do by hand are [[#linear algebra|matrix operations]]. A wide range of practical problems in business and [[#planning]] are, if you frame them the right way, combinations of those kinds of relationships where we are looking for the best outcome, the highest or lowest overall value. This is called [[#optimisation]], and in its simplest form it is something we study in high school.
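The by-hand steps can be sketched as row operations on the coefficient matrix. A minimal sketch in plain Python; the two equations here are made-up examples, not from the page:

```python
# Year-8 style elimination as matrix row operations, using plain
# Python lists. Hypothetical system:
#   2x + 1y = 5
#   1x + 3y = 10
# Augmented matrix: each row is [coefficients..., constant].
rows = [
    [2.0, 1.0, 5.0],
    [1.0, 3.0, 10.0],
]

# Eliminate x from the second row: row2 -= (row2[0]/row1[0]) * row1
factor = rows[1][0] / rows[0][0]
rows[1] = [b - factor * a for a, b in zip(rows[0], rows[1])]

# Back-substitute: solve for y, then for x.
y = rows[1][2] / rows[1][1]
x = (rows[0][2] - rows[0][1] * y) / rows[0][0]
print(x, y)  # x = 1.0, y = 3.0
```

Each hand step (scale a row, subtract it from another) is exactly one matrix row operation, which is why the whole procedure generalises to any number of equations.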
  
A supercomputer, with thousands of those matrix-calculating GPUs repurposed for more general matrix work, runs simulations with varied assumptions using matrix operations and finds the best or average of the ones it tries by iterating, by stepping in little jumps through time. But more detail makes the calculation very much longer; even a supercomputer cannot cope when we ask for more detail. Now we use trained Neural Nets and are just now getting Quantum Computers going. But both of these are much more sophisticated and intense ... matrix calculations! This time much further abstracted from step-by-step logic: they can no longer be described as made up of parts like the examples below, or as representing this or that transformation in some other context.
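The "stepping in little jumps through time" idea can be sketched minimally in plain Python; the \(2{\Tiny\times}2\) update matrix here is a made-up toy, not any real simulation:

```python
# Toy time-stepping: the state vector is advanced by repeated
# multiplication by a fixed (hypothetical) 2x2 update matrix.
step = [[1.0, 0.1],   # position += 0.1 * velocity each jump
        [0.0, 1.0]]   # velocity unchanged

def advance(matrix, vec):
    """Multiply a 2x2 matrix by a column 2-vector."""
    return [sum(m * v for m, v in zip(row, vec)) for row in matrix]

state = [0.0, 1.0]           # [position, velocity]
for _ in range(10):          # ten little jumps through time
    state = advance(step, state)
print(state)                 # position is near 1.0 after ten 0.1 jumps
```

Real simulations use enormous matrices and far better integrators, but the shape of the loop --- state in, matrix multiply, state out --- is the same.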
  
====working with matrices====
So far we know a [[discussion#matrices|matrix]] is a table of [[discussion#scalar]]s (often Real Numbers), rows and columns sometimes representing data or perhaps representing a list of [[discussion#vectors]]. But perhaps it is just a matrix in its own right. I have mentioned we can add them and multiply them by a value from their scalars, and that they can represent the transformations we have studied in cartesian [[discussion#space]] (rotation, stretching, etc.). I also mentioned that [[discussion#Complex Numbers]] can fit in here in [[discussion#complex-as-matrix|an intriguing way]]. So here are some details.
  
Any matrix can be multiplied by its type of scalar: multiply each element individually:
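The element-by-element rule is tiny in plain Python; the matrix and scalar below are made-up numbers:

```python
# Scalar multiplication of a matrix: scale every element individually.
A = [[1, 2],
     [3, 4]]
k = 3
kA = [[k * x for x in row] for row in A]
print(kA)  # [[3, 6], [9, 12]]
```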
  
This is well defined for an \(m{\Tiny\times}n\) matrix `A` and an `m` dimensional row vector \(\bf m\) or an `n` dimensional column vector \(\bf n\). So \({\bf m}A\) gives a new `n` dimensional row vector: multiplying on the left, `A` maps any vector in `m`-space to a new vector in `n`-space. Or the vector might represent a point in cartesian space. \(\ A\bf n\ \) likewise maps a column vector in `n`-space to `m`-space.
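Both mappings can be sketched in plain Python; the \(2{\Tiny\times}3\) matrix and the vectors are made-up examples:

```python
# An m x n matrix (here m=2, n=3) applied to a row vector on the
# left and a column vector on the right.
A = [[1, 2, 3],
     [4, 5, 6]]          # 2 x 3: m = 2 rows, n = 3 columns

def row_times_matrix(v, M):
    """(1 x m) row vector times (m x n) matrix -> (1 x n) row vector."""
    return [sum(v[i] * M[i][j] for i in range(len(M)))
            for j in range(len(M[0]))]

def matrix_times_col(M, v):
    """(m x n) matrix times (n x 1) column vector -> (m x 1) column."""
    return [sum(M[i][j] * v[j] for j in range(len(v)))
            for i in range(len(M))]

m_vec = [1, 1]           # a vector in m-space (2D)
n_vec = [1, 0, 1]        # a vector in n-space (3D)
print(row_times_matrix(m_vec, A))   # [5, 7, 9]  -- lands in n-space
print(matrix_times_col(A, n_vec))   # [4, 10]    -- lands in m-space
```

The same matrix moves row vectors one way (`m`-space to `n`-space) and column vectors the other (`n`-space to `m`-space), which is why keeping track of which side you multiply on matters.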
  
Put another way, `A` multiplied by a vector is a function --- the domain is the points (or vectors) of `m`-space, the codomain is `n`-space. Multiplying matrices is composing these functions. So if `B` is an \(n{\Tiny\times}p\) matrix and \(\bf p\) a vector in `p`-space then consider:
  
The second might seem a more natural notation, fitting better with our function notation. The first fits more with the mapping notation.

Matrix multiplication is the composition of the linear maps the matrices represent. Matrices are written \(\texttt{rows}\times\texttt{cols},\) so an index is \((\texttt{vert},\texttt{horiz})\), ordered as in left-to-right, top-row-first writing. Note \(A{\bf v}\), with \(\bf v\) written vertically, transforms cols-dimensional space to rows-dimensional space. Consider the example above, `AB=C`, if `B` had been a column vector: \({\Tiny\left[\matrix{3\cr0\cr1}\right]}\mapsto\Tiny\left[\matrix{11\cr1}\right].\) Columns on the left must match rows on the right. An \(n{\Tiny\times}1\) matrix `B` is a vector.
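The composition claim --- applying `AB` to a vector equals applying `B` then `A` --- can be checked numerically. A sketch in plain Python with made-up small matrices (not the elided `A` and `B` from the worked example):

```python
# Matrix multiplication as function composition: (AB)p == A(Bp).
A = [[1, 0, 2],
     [0, 1, 1]]          # 2 x 3
B = [[1, 1],
     [0, 2],
     [3, 0]]             # 3 x 2

def matmul(X, Y):
    """(a x b) times (b x c): columns on the left match rows on the right."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))]
            for i in range(len(X))]

def apply(M, v):
    """Matrix times column vector."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

p = [1, 2]                       # a vector in p-space (here 2D)
AB = matmul(A, B)                # 2 x 2: the composed map
print(apply(AB, p))              # [9, 7] -- same as...
print(apply(A, apply(B, p)))     # [9, 7] -- ...B first, then A
```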
  
===square matrices===
Square matrices behave in a more straightforward way, especially considered together with a vector or cartesian space of the same dimension. Two square matrices of the same dimension multiply to make another of the same dimension: the set of `n`-dimensional square matrices is closed under multiplication. There is an Identity under this multiplication, the square matrix with all ones on the main diagonal and zeros elsewhere. This multiplication is not commutative. A square matrix can be multiplied by a vector of the same dimension, considered as a single-row matrix on the left or a single-column matrix on the right; the result is the same kind of vector. This means square matrices define **transformations** in the corresponding cartesian or vector space. They tie together several very important mathematical [[discussion#structures]] mentioned in the discussion [[discussion#draft for matrices discussion|here]].
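Closure, the identity, and non-commutativity are all easy to check by hand or in code. A plain-Python sketch with made-up \(2{\Tiny\times}2\) examples (a reflection and a shear):

```python
# Square matrices: identity and non-commutativity, 2x2 examples.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))]
            for i in range(len(X))]

A = [[0, 1],
     [1, 0]]            # reflection: swaps the two axes
B = [[1, 1],
     [0, 1]]            # shear
I = [[1, 0],
     [0, 1]]            # identity: ones on the main diagonal

print(matmul(A, I) == A)         # True: multiplying by I changes nothing
print(matmul(A, B))              # [[0, 1], [1, 1]]
print(matmul(B, A))              # [[1, 1], [1, 0]] -- different order,
                                 # different answer: not commutative
```

Geometrically this is no surprise: shearing then reflecting is not the same motion as reflecting then shearing.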
  
I talk of transformations, especially with 2\({\times}\)2 matrices, here: [[discussion#linear transformations with matrices]].
  
=====Planning=====
structures/matrix/applications.1740346539.txt.gz · Last modified: 2025/02/24 08:35 by simon