Role of Tensors and Linear Algebra in GPU Architectures

Linear algebra, deep learning, and GPU architectures are fundamentally interlinked.

Alina Khay ∙ Aug 28, 2022

Linear algebra, the mathematics of vectors, matrices, and tensors, forms the foundation of deep learning. GPUs, initially tailored for graphics, have evolved into stellar tools for executing exactly these linear algebra operations.
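To make that concrete, here is a minimal sketch, assuming PyTorch (the post does not name a specific library), that runs the same matrix multiplication on the CPU and, when one is present, on a GPU:

```python
import torch  # PyTorch is an assumption here; any tensor library behaves similarly

# Two 2nd-order tensors (matrices) filled with random values.
a = torch.randn(1024, 1024)
b = torch.randn(1024, 1024)

# Matrix multiplication on the CPU.
c_cpu = a @ b

# If a CUDA-capable GPU is available, move the tensors onto it and
# run the identical operation there, where it executes in parallel.
if torch.cuda.is_available():
    a_gpu = a.cuda()
    b_gpu = b.cuda()
    c_gpu = a_gpu @ b_gpu
```

The operation itself does not change; only the hardware executing it does, which is why GPUs slot so naturally under deep learning workloads.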

Mathematics of Tensors

Tensors, the stars of the show, are versatile arrays that can handle everything from single numbers to multi-dimensional grids. They hold the data, weights, and biases that form the numerical substance of neural networks.

0th-Order Tensor: A solo player, like a single number.

1st-Order Tensor: A team of numbers, marching in order, a.k.a. a vector.

2nd-Order Tensor: A grid of numbers, a matrix with rows and columns.

Higher-Order Tensor: Imagine 3D and beyond, generalizing vectors and matrices (see the sketch after this list).
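As a rough illustration, again assuming PyTorch, each order in the list above corresponds to a tensor with that many dimensions:

```python
import torch  # PyTorch assumed for illustration only

scalar = torch.tensor(3.14)                 # 0th-order: a single number
vector = torch.tensor([1.0, 2.0, 3.0])      # 1st-order: an ordered team of numbers
matrix = torch.tensor([[1.0, 2.0],
                       [3.0, 4.0]])         # 2nd-order: a grid with rows and columns
cube   = torch.randn(32, 28, 28)            # 3rd-order: e.g. a stack of 32 small images

# The number of dimensions matches the tensor's order.
print(scalar.ndim, vector.ndim, matrix.ndim, cube.ndim)  # prints: 0 1 2 3
```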
