# Pre-Calculus

##### Trigonometry

The three "primary" trigonometric functions are **sine**, **cosine**, and **tangent**; the three "secondary" trigonometric functions are their reciprocals: **cosecant**, **secant**, and **cotangent**.

A visual way of understanding them is through a right triangle inscribed in a circle of radius \(r\), with legs of lengths \(x\) (adjacent) and \(y\) (opposite) and hypotenuse \(r\):

**sine**: \(\sin(\theta)=\dfrac{\text{opp}}{\text{hyp}}=\dfrac{y}{r}\)

**cosine**: \(\cos(\theta)=\dfrac{\text{adj}}{\text{hyp}}=\dfrac{x}{r}\)

**tangent**: \(\tan(\theta)=\dfrac{\text{opp}}{\text{adj}}=\dfrac{y}{x}\)

Thinking about \(\sin(\theta)\) and \(\cos(\theta)\) as being legs of a right triangle whose hypotenuse is the radius of the unit circle, we recall the **Pythagorean Theorem**, which you may remember as \(A^{2}+B^{2}=C^{2}\), and apply it to trigonometric functions:
\[\sin^{2}\theta+\cos^{2}\theta=1\]
\[\tan^{2}\theta+1=\sec^{2}\theta\]
\[1+\cot^{2}\theta=\csc^{2}\theta\]
This gives us several ways to simplify various trigonometric expressions.
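These identities are easy to sanity-check numerically. The following Python sketch (illustrative only, not part of the original notes) verifies all three at a few arbitrary sample angles:

```python
import math

# Verify the three Pythagorean identities at sample angles,
# avoiding multiples of pi/2 where tan, sec, csc, or cot are undefined.
for theta in [0.3, 1.0, 2.5, 4.0]:
    s, c = math.sin(theta), math.cos(theta)
    assert math.isclose(s**2 + c**2, 1.0)            # sin^2 + cos^2 = 1
    assert math.isclose((s/c)**2 + 1.0, (1/c)**2)    # tan^2 + 1 = sec^2
    assert math.isclose(1.0 + (c/s)**2, (1/s)**2)    # 1 + cot^2 = csc^2
```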

Other trigonometric identities which may be of use are the angle-sum formulas, the double-angle formulas, and the power-reduction (half-angle) formulas for sine and cosine:
\[\sin(\alpha\pm\beta)=\sin\alpha\cos\beta\pm\cos\alpha\sin\beta\]
\[\cos(\alpha\pm\beta)=\cos\alpha\cos\beta\mp\sin\alpha\sin\beta\]
\[\sin(2\theta)=2\sin\theta\cos\theta\]
\[\cos(2\theta)=\cos^{2}\theta-\sin^{2}\theta\]
\[\sin^{2}\theta=\frac{1-\cos(2\theta)}{2}\]
\[\cos^{2}\theta=\frac{1+\cos(2\theta)}{2}\]
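Each of these can likewise be checked numerically; here is a small Python sketch (illustrative only, with arbitrarily chosen angles) confirming the angle-sum, double-angle, and power-reduction identities:

```python
import math

a, b = 0.8, 0.4  # arbitrary sample angles

# Angle-sum formulas
assert math.isclose(math.sin(a + b), math.sin(a)*math.cos(b) + math.cos(a)*math.sin(b))
assert math.isclose(math.cos(a + b), math.cos(a)*math.cos(b) - math.sin(a)*math.sin(b))

# Double-angle formulas
assert math.isclose(math.sin(2*a), 2*math.sin(a)*math.cos(a))
assert math.isclose(math.cos(2*a), math.cos(a)**2 - math.sin(a)**2)

# Power-reduction (half-angle) forms
assert math.isclose(math.sin(a)**2, (1 - math.cos(2*a)) / 2)
assert math.isclose(math.cos(a)**2, (1 + math.cos(2*a)) / 2)
```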

Of course, there are many more trigonometric formulas and identities, and it helps to remember the common right triangles for knowing values of sine and cosine, but these are some common useful formulas.

# Calculus

##### \(\varepsilon\)-\(\delta\) Definition of a Limit

One of the first ideas in calculus is that of a **limit**, so we want a precise definition of a limit.

We write the limit of \(f(x)\) as \(x\) approaches \(c\) as the number \(L\): \[L=\lim_{x\to c}f(x)\] This limit exists if for every \(\varepsilon>0\), there exists some \(\delta>0\) such that \[\text{if }0<|x-c|<\delta,\text{ then }|f(x)-L|<\varepsilon.\] Visually, this means that as we narrow our vertical range of \(f(x)\) near \(L\), we can find a suitable restriction of our domain near \(c\) so that on the interval \((c-\delta,c+\delta)\) (excluding \(c\) itself), the value of \(f(x)\) lives inside \((L-\varepsilon,L+\varepsilon)\).
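To make the definition concrete, consider \(f(x)=x^{2}\) at \(c=2\) with \(L=4\): since \(|x^{2}-4|=|x-2|\,|x+2|\) and \(|x+2|<5\) whenever \(|x-2|<1\), the choice \(\delta=\min(1,\varepsilon/5)\) works. The Python sketch below (illustrative only) checks this choice of \(\delta\) on a grid of sample points:

```python
def f(x):
    return x * x

c, L = 2.0, 4.0

def delta_for(eps):
    # For f(x) = x^2 at c = 2: if |x - 2| < 1 then |x + 2| < 5,
    # so |x^2 - 4| = |x - 2| * |x + 2| < 5 * delta.
    # Hence delta = min(1, eps/5) guarantees |f(x) - 4| < eps.
    return min(1.0, eps / 5.0)

for eps in [1.0, 0.1, 0.01]:
    d = delta_for(eps)
    # sample points in the punctured interval (c - d, c + d), skipping x = c
    xs = [c - d + (2 * d) * k / 1000 for k in range(1, 1000) if k != 500]
    assert all(abs(f(x) - L) < eps for x in xs)
```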

##### Continuous Functions

With limits we can define **continuous** functions as those whose limits equal the values of the function, that is, a function is continuous at \(a\) if
\[\lim_{x\to a}f(x)=f(a).\]

We more simply say a function is **continuous** if it is continuous everywhere.

The first big result about continuous functions is the **Intermediate Value Theorem**, which says that continuous functions pass through every intermediate value.

**Intermediate Value Theorem**

Let \(f(x)\) be a continuous function on an interval \([a,b]\). Then for every real number \(K\) between \(f(a)\) and \(f(b)\), there exists a \(c\) in \([a,b]\) such that \[f(c)=K.\]
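The theorem is non-constructive, but for a concrete continuous function a bisection search actually locates such a \(c\); at every step the target value \(K\) stays trapped between the endpoint values, which is exactly the Intermediate Value Theorem at work. An illustrative Python sketch (the function and interval are arbitrary choices):

```python
def bisect_for_value(f, a, b, K, tol=1e-10):
    """Find c in [a, b] with f(c) ~= K for continuous f,
    assuming K lies between f(a) and f(b)."""
    lo, hi = a, b
    if f(lo) > f(hi):            # orient so that f(lo) <= K <= f(hi)
        lo, hi = hi, lo
    while abs(hi - lo) > tol:
        mid = (lo + hi) / 2
        if f(mid) < K:
            lo = mid             # K still lies above f(lo)
        else:
            hi = mid             # K still lies below f(hi)
    return (lo + hi) / 2

# Example: f(x) = x^3 - x is continuous on [1, 2] with f(1) = 0 and f(2) = 6,
# so the theorem guarantees some c in [1, 2] with f(c) = 3.
c = bisect_for_value(lambda x: x**3 - x, 1.0, 2.0, 3.0)
assert abs(c**3 - c - 3.0) < 1e-8
```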

##### Squeeze Theorem

Another useful result is the **Squeeze Theorem**, named for how two limits above and below force ("squeeze") the middle limit:

**Squeeze Theorem**

Let \(h(x)\leq f(x)\leq g(x)\) for all \(x\) near \(c\). If
\[\lim_{x\to c}h(x)=L=\lim_{x\to c}g(x),\]
then we also have \[\lim_{x\to c}f(x)=L.\]

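A classic application is \(f(x)=x^{2}\sin(1/x)\), which oscillates infinitely often near \(0\) but is squeezed between \(h(x)=-x^{2}\) and \(g(x)=x^{2}\), both of which tend to \(0\); the theorem then forces \(\lim_{x\to 0}f(x)=0\). A quick numerical sketch (illustrative only):

```python
import math

# f(x) = x^2 * sin(1/x) oscillates wildly near 0, but is squeezed
# between h(x) = -x^2 and g(x) = x^2, both of which tend to 0.
def f(x):
    return x * x * math.sin(1 / x)

for x in [0.1, 0.01, 0.001, 1e-6]:
    assert -x*x <= f(x) <= x*x     # the squeeze h(x) <= f(x) <= g(x)
    assert abs(f(x)) <= x*x        # hence f(x) -> 0 as x -> 0
```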
##### Limit Definition of a Derivative

Once we have a notion of limits and continuity, we can start understanding the derivative as a very important kind of limit. The derivative is most easily understood as an instantaneous rate of change (such as velocity) or as the slope of a function's graph.

For a function \(f(x)\) near a point \(c\), we define the **derivative** at \(c\) written \(f'(c)\) to be the limit (if it exists):
\[f'(c)=\lim_{h\to 0}\frac{f(c+h)-f(c)}{h}\]
which you can think of as the ratio of \(\Delta y=f(c+h)-f(c)\) to \(\Delta x=(c+h)-c=h\) as \(\Delta x=h\) approaches \(0\).
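Numerically, the difference quotient visibly converges to the derivative as \(h\) shrinks. For the arbitrary choice \(f(x)=x^{2}\) at \(c=3\), the quotient simplifies to \((6h+h^{2})/h=6+h\), so the error is essentially \(h\) itself; a short illustrative sketch:

```python
def diff_quotient(f, c, h):
    """The difference quotient (f(c+h) - f(c)) / h; its limit as
    h -> 0 (when it exists) is the derivative f'(c)."""
    return (f(c + h) - f(c)) / h

# For f(x) = x^2 at c = 3: (f(3+h) - f(3))/h = (6h + h^2)/h = 6 + h,
# so the quotient approaches the derivative f'(3) = 6 as h -> 0.
f = lambda x: x * x
for h in [0.1, 0.01, 0.001]:
    assert abs(diff_quotient(f, 3.0, h) - 6.0) < 2 * h
```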

##### Derivative Function

While the limit definition of a derivative is precise, it is unwieldy and often difficult to work with as the limits get more complicated.

Instead, we usually look for a function called the **derivative** which gives us all of the values \(f'(x)\) for any given input \(x\).

Similar to the limit definition of a derivative, we define the derivative function by \[f'(x)=\lim_{h\to 0}\frac{f(x+h)-f(x)}{h}\] and much of the first calculus course is interested in tricks and tools for computing derivatives and their applications.

Visually, the derivative can be thought of as the slope of a tangent line to a function's graph (if it exists) and gives us information such as **critical points** (when \(f'(x)=0\)).
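As an illustrative sketch, the derivative function can be approximated pointwise by a central difference, and critical points located by scanning for sign changes of \(f'\). Here for the arbitrary choice \(f(x)=x^{3}-3x\), whose derivative \(f'(x)=3x^{2}-3\) vanishes exactly at \(x=\pm 1\):

```python
def approx_derivative(f, x, h=1e-6):
    # central-difference approximation to the derivative f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: x**3 - 3*x     # f'(x) = 3x^2 - 3, critical points at x = -1, 1

# scan a grid for sign changes of the (approximate) derivative;
# the grid is offset so no sample lands exactly on a critical point
critical = []
xs = [-2.005 + k * 0.01 for k in range(402)]
for a, b in zip(xs, xs[1:]):
    if approx_derivative(f, a) * approx_derivative(f, b) < 0:
        critical.append((a + b) / 2)

assert len(critical) == 2
assert abs(critical[0] + 1) < 0.01 and abs(critical[1] - 1) < 0.01
```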

# Set Theory

##### Sets

A **set** is abstractly defined to be a collection of objects called **elements**.

We usually use uppercase letters for sets and lowercase letters for elements, for example an element \(a\) in the set \(S\) can be written \[a\in S.\]
Sets are not necessarily ordered and we *do not* distinguish between repeated elements.

Some useful examples of numerical sets are:

\(\begin{align*} \mathbb{C}&\text{ : the set of complex numbers}\\ \mathbb{R}&\text{ : the set of real numbers}\\ \mathbb{Q}&\text{ : the set of rational numbers}\\ \mathbb{Z}&\text{ : the set of integers}\\ \mathbb{N}&\text{ : the set of natural numbers}\end{align*}\)

Of course, there are *many* other examples, but the most general way to construct sets is using **set-builder notation** by writing "the set of all \(x\in A\) such that \(x\) satisfies some additional property \(P(x)\)":
\[S=\{x\in A:x\text{ satisfies a property }P(x)\},\]
for example, the set of positive real numbers \(\mathbb{R}_{+}\) can be written as
\[\mathbb{R}_{+}=\{x\in\mathbb{R}:x>0\}.\]
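For finite sets, set-builder notation translates almost verbatim into a Python set comprehension; a small illustrative sketch, using a finite range of integers as a stand-in for a numerical set:

```python
# Set-builder notation {x in A : P(x)} becomes {x for x in A if P(x)}.
A = set(range(-5, 6))            # a finite stand-in for a numerical set
S = {x for x in A if x > 0}      # "the set of x in A such that x > 0"
assert S == {1, 2, 3, 4, 5}
```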

##### Subsets

A **subset** is any set which is "contained" inside a larger set, denoted \(B\subset A\).

We more rigorously say that \(B\subset A\) if for every \(b\in B\), we also have \(b\in A\).

A typical proof of subset inclusion \(B\subset A\) involves taking an arbitrary element \(b\in B\) and showing that \(b\) is also an element of \(A\).

We can also define **set equality** between two sets if both \(B\subset A\) and \(A\subset B\), which is unsurprisingly written \(B=A\).

##### Null Set

There is a special set called the empty set or **null set** which contains no elements, typically denoted \(\varnothing=\{\}\).

A useful fact about the null set is that it is a subset of *every* set, \(\varnothing\subset A\).
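For finite sets, the subset relation, set equality as mutual inclusion, and the null-set fact can all be checked directly in Python (the sets here are arbitrary examples):

```python
A = {1, 2, 3, 4}
B = {2, 4}

# B is a subset of A: every element of B is also an element of A
assert all(b in A for b in B)
assert B <= A                    # Python's built-in subset test

# Set equality is mutual inclusion
C = {4, 2}
assert C <= B and B <= C and B == C

# The null set is (vacuously) a subset of *every* set
assert set() <= A and set() <= B and set() <= set()
```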

##### Set Union and Intersection

We want a kind of "arithmetic" on sets, so that we can combine sets and make them interact with each other. The first operation we'll see is the **set union**.

For two sets \(A\) and \(B\), the **union** of these sets is written \(A\cup B\) and defined to be the set of all elements in either \(A\), \(B\), or both.

An interesting result is that if \(B\subset A\), then \(A\cup B=A\).

The next "operation" we will understand is **set intersection**.

For two sets \(A\) and \(B\), the **intersection** of these sets is written \(A\cap B\) and defined to be the set of all elements in both \(A\) and \(B\).

We make the special definition that if \(A\cap B=\varnothing\), then \(A\) and \(B\) are called **disjoint**.

An interesting result is that if \(B\subset A\), then \(A\cap B=B\).
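Both operations, and the two results above about a subset \(B\subset A\), can be demonstrated with Python's built-in set operators (arbitrary example sets):

```python
A = {1, 2, 3, 4}
B = {2, 4}          # note B is a subset of A
D = {5, 6}

assert A | B == {1, 2, 3, 4}    # union
assert A & B == {2, 4}          # intersection
assert A & D == set()           # A and D are disjoint

# The two results for a subset B of A:
assert A | B == A               # union with a subset changes nothing
assert A & B == B               # intersection with a subset gives the subset
```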

##### Set Difference and Product

Some other set operations are the difference and Cartesian product.

For two sets \(A\) and \(B\), the **difference** \(A-B\) is the set of all elements of \(A\) that are not elements of \(B\).

If \(B\subset A\), then we can also think of this as the **complement** of \(B\) inside of \(A\), sometimes denoted \(B^{\complement}\) if \(A\) is understood to be the "universal set".

The **product** of two sets \(A\) and \(B\) is written \(A\times B\) and is defined to be the set of ordered pairs \((a,b)\) such that \(a\in A\) and \(b\in B\).

In the special case that we have \(A\times A\), we use power notation \(A^{2}\); you can understand this in the same way we describe the plane \(\mathbb{R}^{2}=\mathbb{R}\times\mathbb{R}\) as ordered pairs of real numbers \((x,y)\).
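A short Python sketch of the difference and Cartesian product on arbitrary finite sets, using `itertools.product` for the ordered pairs (illustrative only):

```python
from itertools import product

A = {1, 2, 3, 4}
B = {2, 4}

# Set difference A - B: elements of A that are not in B
assert A - B == {1, 3}

# Cartesian product A x B: all ordered pairs (a, b) with a in A, b in B
AxB = set(product(A, B))
assert (1, 2) in AxB and (2, 1) not in AxB   # pairs are ordered
assert len(AxB) == len(A) * len(B)

# A x A, written A^2
A2 = set(product(A, repeat=2))
assert len(A2) == len(A) ** 2
```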

##### Power Set

For a set \(X\), the **power set** of \(X\) is written \(\mathcal{P}(X)\) and defined to be the set of all subsets of \(X\).

Visualizing the power set of \(X\) is trickier, especially for very large sets. However, for small sets we can picture the power set by forming a "subset lattice" of points, where each point denotes a subset of \(X\).

An important notational distinction is that subsets of \(X\) are *not* subsets of the power set; rather, subsets are *elements* of the power set:
\(\begin{align*}
\varnothing&\subset X&\varnothing&\not\subset\mathcal{P}(X)&\varnothing&\in\mathcal{P}(X)\\
A&\subset X& A&\not\subset\mathcal{P}(X)&A&\in\mathcal{P}(X)\\
X&\subset X& X&\not\subset\mathcal{P}(X)&X&\in\mathcal{P}(X)
\end{align*}\)

We define the **cardinality** of a set to be the number of elements in that set, written \(\#X\) or \(|X|\).

For finite sets, such as \(X=\{a,b,c\}\), we can see that \(\mathcal{P}(X)\) has \(|\mathcal{P}(X)|=2^{|X|}\) elements, but for large (possibly infinite) sets this is not always intuitive. This is where functions and bijections will come in handy for relating the cardinalities of two different sets.
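For finite sets the power set can be built explicitly, which also confirms the cardinality formula and the membership facts above. An illustrative sketch; the helper `power_set` is ad hoc, not a standard-library function:

```python
from itertools import chain, combinations

def power_set(X):
    """All subsets of X, as a set of frozensets."""
    xs = list(X)
    return {frozenset(c) for c in chain.from_iterable(
        combinations(xs, r) for r in range(len(xs) + 1))}

X = {"a", "b", "c"}
P = power_set(X)

assert frozenset() in P           # the null set is an *element* of P(X)
assert frozenset(X) in P          # X itself is an *element* of P(X)
assert len(P) == 2 ** len(X)      # |P(X)| = 2^|X|
```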

##### Functions

We have already seen functions from Pre-Calculus and Calculus in the form \(y=f(x)\), which can be visualized nicely as graphs, but we can now understand them abstractly in terms of sets.

We can then abstract functions \(f:A\to B\) from a **domain set** \(A\) to a **codomain set** \(B\) by requiring that every \(a\in A\) is mapped to some \(b\in B\) and that the assignment is unambiguous: if \(f(a)=b_{1}\) and \(f(a)=b_{2}\), then \(b_{1}=b_{2}\).

We call \(f(a)\) the **image** of \(a\) under \(f\) and we call \(f(A)\subset B\) the **range** or the **image** of \(A\) under \(f\).

A special kind of function worth noting is a **real-valued function**, where the codomain is the real numbers \(\mathbb{R}\).

In general, it will be useful to consider **restrictions** of functions, where we restrict the domain of \(f\) to some subset \(U\subset A\), denoted by \(f|_{U}\).

We can also revisit **function composition** in the realm of sets; if \(f:A\to B\) and \(g:B\to C\) are functions, then we can compose these functions to get \(g\circ f:A\to C\), given by \((g\circ f)(a)=g(f(a))\) for any \(a\in A\).

##### Properties of Functions

We are often interested in finding inverses of functions, which helps us solve certain kinds of problems. Functions naturally come with the notion of **preimages**, where the preimage of any element \(b\in B\) is the subset \(f^{-1}(b)\subset A\) such that for any \(a\in f^{-1}(b)\), \(f(a)=b\); this idea naturally extends to subsets of \(B\) as well.

While \(f^{-1}(B)=A\), it is possible that the image of \(A\) under \(f\) is not all of \(B\). In the special case where \(f(A)=B\), we say that the function \(f\) is **surjective** or **onto**.

However, all that this means is that every \(b\in B\) has some element \(a\in A\) so that \(f(a)=b\); it could be that multiple elements are mapped to the same \(b\).

In the special case where no two elements of \(A\) are sent to the same element \(b\in B\), we say that the function \(f\) is **injective** or **one-to-one**.

When we have a function \(f\) which is both injective and surjective, we say that \(f\) is **bijective** or a **one-to-one correspondence**.

If this is the case, then our bijective function admits an **inverse function** \(f^{-1}:B\to A\) where \(f^{-1}(b)=a\) if and only if \(f(a)=b\).
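On finite sets a function can be modeled as a dictionary, making injectivity, surjectivity, and the inverse of a bijection directly checkable. An illustrative sketch (the helper names are ad hoc):

```python
# Model a function f: A -> B on finite sets as a dict with keys A.
A = {1, 2, 3}
B = {"x", "y", "z"}
f = {1: "x", 2: "y", 3: "z"}

def is_injective(f):
    # injective: no two inputs share an output
    return len(set(f.values())) == len(f)

def is_surjective(f, B):
    # surjective: the image f(A) is all of B
    return set(f.values()) == B

assert is_injective(f) and is_surjective(f, B)   # f is bijective

# A bijection admits an inverse: f_inv(b) = a if and only if f(a) = b
f_inv = {b: a for a, b in f.items()}
assert all(f_inv[f[a]] == a for a in A)
```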

# Topology

##### Set Topology

A **topology** on a set \(X\) is a subset of the power set \(\tau\subset\mathcal{P}(X)\) satisfying:

I. \(\varnothing,X\in\tau\).

II. If \(U_{\alpha}\in\tau\) for all \(\alpha\in A\), then \(\bigcup_{\alpha}U_{\alpha}\in\tau\).

III. If \(U_{i}\in\tau\) for \(i\in\{1,\ldots,n\}\), then \(\bigcap_{i=1}^{n}U_{i}\in\tau\).

We call a set \(X\) endowed with a topology \(\tau\) a **topological space**, sometimes written \((X,\tau)\). We call sets in a topology **open sets**, usually written \(U,V\in\tau\).

Every set always has the following two topologies on it:

**Trivial Topology :** For a set \(X\), the trivial topology \(\tau_{\text{trivial}}=\{\varnothing,X\}\).

**Discrete Topology :** For a set \(X\), the discrete topology \(\tau_{\text{discrete}}=\mathcal{P}(X)\).
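For a finite set the three axioms can be verified exhaustively; on a finite collection, closure under *pairwise* unions and intersections already yields axioms II and III. An illustrative Python sketch (the helper `is_topology` is ad hoc):

```python
from itertools import chain, combinations

def is_topology(X, tau):
    """Check the topology axioms for tau (a set of frozensets) on X."""
    X = frozenset(X)
    if frozenset() not in tau or X not in tau:      # axiom I
        return False
    for U in tau:                                   # axioms II and III:
        for V in tau:                               # pairwise closure suffices
            if U | V not in tau or U & V not in tau:  # on a finite collection
                return False
    return True

X = {1, 2, 3}
trivial = {frozenset(), frozenset(X)}
all_subsets = chain.from_iterable(combinations(list(X), r) for r in range(len(X) + 1))
discrete = {frozenset(c) for c in all_subsets}

assert is_topology(X, trivial)
assert is_topology(X, discrete)
# {∅, X, {1}, {2}} fails: the union {1, 2} is missing
assert not is_topology(X, {frozenset(), frozenset(X), frozenset({1}), frozenset({2})})
```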

##### Continuous Functions

For a function between two topological spaces \[f:(A,\tau_{A})\longrightarrow(B,\tau_{B})\] we say that \(f\) is **continuous** if for every open set \(V\in\tau_{B}\), its preimage \(f^{-1}(V)\) is an open set in \(\tau_{A}\).

**WARNING :** A continuous function only requires the preimage of open sets to be open; it *does not* require that the image of an open set is open (continuity only "goes backwards")!

This is an extension of our idea of continuity from Calculus, where the preimage of the open interval \((f(c)-\varepsilon,f(c)+\varepsilon)\) in \(B=\mathbb{R}\) contains an open interval \((c-\delta,c+\delta)\) in \(A=\mathbb{R}\).
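On finite topological spaces this definition can be checked directly by computing preimages. An illustrative Python sketch using a small Sierpiński-style topology (all names are ad hoc):

```python
def preimage(f, V):
    # preimage of the set V under the dict-function f
    return frozenset(a for a in f if f[a] in V)

def is_continuous(f, tau_A, tau_B):
    """f is continuous iff the preimage of every open set is open."""
    return all(preimage(f, V) in tau_A for V in tau_B)

A, B = {1, 2}, {"p", "q"}
tau_A = {frozenset(), frozenset({1}), frozenset(A)}        # Sierpinski-style topology
tau_B = {frozenset(), frozenset({"p"}), frozenset(B)}

f = {1: "p", 2: "q"}
assert is_continuous(f, tau_A, tau_B)       # preimage of {"p"} is {1}: open

g = {1: "q", 2: "p"}
assert not is_continuous(g, tau_A, tau_B)   # preimage of {"p"} is {2}: not open
```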

##### Separation Axioms

We typically classify topological spaces by how "coarse" or "fine" they are; the Diagram of the Separation Axioms gives a more complete exposition of these classifications.

The basic idea of these separation axioms is that objects inside a topological space can sometimes be "separated" by open sets in various ways.

\(T_{0}\) : A topological space \(X\) is \(T_{0}\) or **Kolmogorov** if for every \(x\neq y\) in \(X\), there exists an open set \(U_{x}\) containing \(x\) which does not contain \(y\) *or* an open set \(U_{y}\) containing \(y\) which does not contain \(x\).

\(T_{1}\) : A topological space \(X\) is \(T_{1}\) or **Fréchet** if for every \(x\neq y\) in \(X\), there exists an open set \(U_{x}\) containing \(x\) but not \(y\) and another open set \(U_{y}\) containing \(y\) but not \(x\).

\(T_{2}\) : A topological space \(X\) is \(T_{2}\) or **Hausdorff** if for every \(x\neq y\) in \(X\), there exist disjoint open sets \(U_{x}\cap U_{y}=\varnothing\) where \(U_{x}\) contains \(x\) and \(U_{y}\) contains \(y\).

\(T_{2.5}\) : A topological space \(X\) is \(T_{2.5}\) or **Urysohn** if for every \(x\neq y\) in \(X\), there exist open sets \(U_{x}\) containing \(x\) and \(U_{y}\) containing \(y\) whose closures are disjoint, \(\overline{U_{x}}\cap\overline{U_{y}}=\varnothing\).

\(T_{3}\) : A topological space \(X\) is \(T_{3}\) or **Regular Hausdorff** if for every closed set \(F\subset X\) and every point \(x\notin F\), there exist disjoint open sets \(U_{x}\cap U_{F}=\varnothing\) with \(x\in U_{x}\) and \(F\subset U_{F}\).

\(T_{4}\) : A topological space \(X\) is \(T_{4}\) or **Normal** if for every pair of disjoint closed sets \(E\cap F=\varnothing\) in \(X\), there exist disjoint open sets \(U_{E}\cap U_{F}=\varnothing\) with \(E\subset U_{E}\) and \(F\subset U_{F}\).

\(T_{5}\) : A topological space \(X\) is \(T_{5}\) or **Completely Normal** if every subspace \(Y\subset X\) is normal.

**Subspaces**, **quotient spaces**, and **product spaces** are further topics of study in topology, each of which inherits a topological structure from its original corresponding space. For an example of a quotient space, consider the unit square \([0,1]\times[0,1]\) with the equivalence \((x,0)\sim(x,1)\) and \((0,x)\sim(1,x)\) for all \(x\in[0,1]\); the quotient space \[T^{2}=[0,1]\times[0,1]/\sim\] can be visualized as a **torus** by "gluing" the identified edges together.