Sunday, March 26, 2023
Okane Pedia
Two-Dimensional Tensors in PyTorch – MachineLearningMastery.com

November 20, 2022


Last Updated on November 15, 2022

Two-dimensional tensors are analogous to two-dimensional matrices. Like a two-dimensional matrix, a two-dimensional tensor also has a number of rows and columns.

Let’s take a grayscale image as an example, which is a two-dimensional matrix of numeric values, commonly known as pixels. Ranging from ‘0’ to ‘255’, each number represents a pixel intensity value. Here, the lowest intensity number (‘0’) represents black areas in the image, while the highest intensity number (‘255’) represents white areas. Using the PyTorch framework, this two-dimensional image or matrix can be converted to a two-dimensional tensor.
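As a small illustration of this idea, the sketch below (the pixel values are made up for the example) stores a tiny grayscale “image” as a nested list and converts it to a tensor:

```python
import torch

# A tiny 2x3 "image": each entry is a pixel intensity
# from 0 (black) to 255 (white)
pixels = [[0, 128, 255],
          [64, 192, 32]]

image_tensor = torch.tensor(pixels)
print("Image as a 2D tensor: ", image_tensor)
print("Brightest pixel: ", image_tensor.max())   # tensor(255)
```

A real image would simply be a much larger matrix of the same kind.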

In the previous post, we learned about one-dimensional tensors in PyTorch and applied some useful tensor operations. In this tutorial, we’ll apply those operations to two-dimensional tensors using the PyTorch library. Specifically, we’ll learn:

  • How to create two-dimensional tensors in PyTorch and explore their types and shapes.
  • About slicing and indexing operations on two-dimensional tensors in detail.
  • To apply a number of methods to tensors, such as tensor addition, multiplication, and more.

Let’s get began.

Two-Dimensional Tensors in PyTorch
Image by dylan dolte. Some rights reserved.

Tutorial Overview

This tutorial is divided into parts; they are:

  • Types and shapes of two-dimensional tensors
  • Converting two-dimensional tensors into NumPy arrays
  • Converting pandas series to two-dimensional tensors
  • Indexing and slicing operations on two-dimensional tensors
  • Operations on two-dimensional tensors

Types and Shapes of Two-Dimensional Tensors

Let’s first import a few necessary libraries we’ll use in this tutorial.

import torch
import numpy as np
import pandas as pd

To check the types and shapes of two-dimensional tensors, we’ll use the same methods from PyTorch, introduced previously for one-dimensional tensors. But should it work the same way it did for one-dimensional tensors?

Let’s demonstrate by converting a 2D list of integers to a 2D tensor object. As an example, we’ll create a 2D list and apply torch.tensor() for the conversion.

example_2D_list = [[5, 10, 15, 20],
                   [25, 30, 35, 40],
                   [45, 50, 55, 60]]
list_to_tensor = torch.tensor(example_2D_list)
print("Our New 2D Tensor from 2D List is: ", list_to_tensor)

Our New 2D Tensor from 2D List is:  tensor([[ 5, 10, 15, 20],
        [25, 30, 35, 40],
        [45, 50, 55, 60]])

As you can see, the torch.tensor() method also works well for two-dimensional tensors. Now, let’s use the shape attribute and the size() and ndimension() methods to return the shape, size, and number of dimensions of a tensor object.

print("Getting the shape of tensor object: ", list_to_tensor.shape)
print("Getting the size of tensor object: ", list_to_tensor.size())
print("Getting the dimensions of tensor object: ", list_to_tensor.ndimension())

Getting the shape of tensor object:  torch.Size([3, 4])
Getting the size of tensor object:  torch.Size([3, 4])
Getting the dimensions of tensor object:  2
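Tensors of a known 2D shape can also be created directly, without starting from a Python list. This short sketch (not part of the original example) uses torch.zeros() and torch.ones() to build tensors with the same 3×4 shape:

```python
import torch

# Create 2D tensors of a given shape directly
zeros_tensor = torch.zeros(3, 4)   # 3 rows, 4 columns, all 0.0
ones_tensor = torch.ones(3, 4)     # 3 rows, 4 columns, all 1.0

print("Shape of zeros tensor: ", zeros_tensor.shape)        # torch.Size([3, 4])
print("Dimensions of ones tensor: ", ones_tensor.ndimension())  # 2
```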

Converting Two-Dimensional Tensors to NumPy Arrays

PyTorch allows us to convert a two-dimensional tensor to a NumPy array and then back to a tensor. Let’s find out how.

# Converting a 2D tensor to a numpy array

twoD_tensor_to_numpy = list_to_tensor.numpy()
print("Converting two_Dimensional tensor to numpy array:")
print("Numpy array after conversion: ", twoD_tensor_to_numpy)
print("Data type after conversion: ", twoD_tensor_to_numpy.dtype)

print("***************************************************************")

# Converting the numpy array back to a tensor

back_to_tensor = torch.from_numpy(twoD_tensor_to_numpy)
print("Converting numpy array back to two_Dimensional tensor:")
print("Tensor after conversion:", back_to_tensor)
print("Data type after conversion: ", back_to_tensor.dtype)

Converting two_Dimensional tensor to numpy array:
Numpy array after conversion:  [[ 5 10 15 20]
 [25 30 35 40]
 [45 50 55 60]]
Data type after conversion:  int64
***************************************************************
Converting numpy array back to two_Dimensional tensor:
Tensor after conversion: tensor([[ 5, 10, 15, 20],
        [25, 30, 35, 40],
        [45, 50, 55, 60]])
Data type after conversion:  torch.int64
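One detail worth knowing, though not shown in the code: on the CPU, .numpy() and torch.from_numpy() share the same underlying memory rather than copying it, so an in-place change on one side is visible on the other. A minimal sketch:

```python
import torch

tensor = torch.tensor([[5, 10], [15, 20]])
array = tensor.numpy()

# The tensor and the array view the same buffer, so this in-place
# edit to the NumPy array also changes the tensor
array[0, 0] = 99
print(tensor[0, 0])   # tensor(99)
```

If an independent copy is needed, call .copy() on the array (or .clone() on the tensor) first.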

Converting Pandas Series to Two-Dimensional Tensors

Similarly, we can also convert a pandas DataFrame to a tensor. As with the one-dimensional tensors, we’ll use the same steps for the conversion: using the values attribute we’ll get the NumPy array, and then use torch.from_numpy, which lets you convert a pandas DataFrame to a tensor.

Here is how we’ll do it.

# Converting a Pandas Dataframe to a Tensor

dataframe = pd.DataFrame({'x': [22, 24, 26], 'y': [42, 52, 62]})

print("Pandas to numpy conversion: ", dataframe.values)
print("Data type before tensor conversion: ", dataframe.values.dtype)

print("***********************************************")

pandas_to_tensor = torch.from_numpy(dataframe.values)
print("Getting new tensor: ", pandas_to_tensor)
print("Data type after conversion to tensor: ", pandas_to_tensor.dtype)

Pandas to numpy conversion:  [[22 42]
 [24 52]
 [26 62]]
Data type before tensor conversion:  int64
***********************************************
Getting new tensor:  tensor([[22, 42],
        [24, 52],
        [26, 62]])
Data type after conversion to tensor:  torch.int64
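Since torch.from_numpy() preserves the int64 dtype of the DataFrame, and most PyTorch models expect float32 inputs, a common follow-up step is an explicit cast. A minimal sketch of that extra step (not part of the original example):

```python
import torch
import pandas as pd

dataframe = pd.DataFrame({'x': [22, 24, 26], 'y': [42, 52, 62]})

# Cast the converted tensor to float32, the dtype most models expect
pandas_to_float = torch.from_numpy(dataframe.values).float()
print(pandas_to_float.dtype)   # torch.float32
```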

Indexing and Slicing Operations on Two-Dimensional Tensors

For indexing operations, different elements in a tensor object can be accessed using square brackets. You can simply put the corresponding indices in square brackets to access the desired elements in a tensor.

In the example below, we’ll create a tensor and access certain elements using two different methods. Note that the index value is always one less than the position of the element in a two-dimensional tensor.

example_tensor = torch.tensor([[10, 20, 30, 40],
                               [50, 60, 70, 80],
                               [90, 100, 110, 120]])
print("Accessing element in 2nd row and 2nd column: ", example_tensor[1, 1])
print("Accessing element in 2nd row and 2nd column: ", example_tensor[1][1])

print("********************************************************")

print("Accessing element in 3rd row and 4th column: ", example_tensor[2, 3])
print("Accessing element in 3rd row and 4th column: ", example_tensor[2][3])

Accessing element in 2nd row and 2nd column:  tensor(60)
Accessing element in 2nd row and 2nd column:  tensor(60)
********************************************************
Accessing element in 3rd row and 4th column:  tensor(120)
Accessing element in 3rd row and 4th column:  tensor(120)
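Note that indexing returns a zero-dimensional tensor such as tensor(60), not a plain Python number; when the bare number is needed, the item() method extracts it. A short sketch:

```python
import torch

example_tensor = torch.tensor([[10, 20, 30, 40],
                               [50, 60, 70, 80],
                               [90, 100, 110, 120]])

# .item() converts a single-element tensor to a plain Python number
value = example_tensor[1, 1].item()
print(value)        # 60
print(type(value))  # <class 'int'>
```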

What if we need to access two or more elements at the same time? That’s where tensor slicing comes into play. Let’s use the previous example to access the first two elements of the second row and the first three elements of the third row.

example_tensor = torch.tensor([[10, 20, 30, 40],
                               [50, 60, 70, 80],
                               [90, 100, 110, 120]])
print("Accessing first two elements of the second row: ", example_tensor[1, 0:2])
print("Accessing first two elements of the second row: ", example_tensor[1][0:2])

print("********************************************************")

print("Accessing first three elements of the third row: ", example_tensor[2, 0:3])
print("Accessing first three elements of the third row: ", example_tensor[2][0:3])

Accessing first two elements of the second row:  tensor([50, 60])
Accessing first two elements of the second row:  tensor([50, 60])
********************************************************
Accessing first three elements of the third row:  tensor([ 90, 100, 110])
Accessing first three elements of the third row:  tensor([ 90, 100, 110])
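Slicing also works across the row dimension: a colon in the row position selects every row, which lets you extract whole columns or rectangular sub-blocks. A short sketch using the same tensor:

```python
import torch

example_tensor = torch.tensor([[10, 20, 30, 40],
                               [50, 60, 70, 80],
                               [90, 100, 110, 120]])

# The colon selects every row, so this pulls out the entire second column
second_column = example_tensor[:, 1]
print(second_column)   # tensor([ 20,  60, 100])

# Slicing both dimensions extracts a rectangular sub-block
sub_block = example_tensor[0:2, 1:3]
print(sub_block)       # tensor([[20, 30], [60, 70]])
```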

Operations on Two-Dimensional Tensors

While there are a lot of operations you can apply to two-dimensional tensors using the PyTorch framework, here we’ll introduce you to tensor addition, and scalar and matrix multiplication.

Adding Two-Dimensional Tensors

Adding two tensors is similar to matrix addition. It’s quite a straightforward process, as you simply need the addition (+) operator to perform the operation. Let’s add two tensors in the example below.

A = torch.tensor([[5, 10],
                  [50, 60],
                  [100, 200]])
B = torch.tensor([[10, 20],
                  [60, 70],
                  [200, 300]])
add = A + B
print("Adding A and B to get: ", add)

Adding A and B to get:  tensor([[ 15,  30],
        [110, 130],
        [300, 500]])
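Beyond adding two tensors of exactly the same shape, PyTorch also broadcasts: a one-row tensor or a plain scalar is automatically expanded to match the 2D shape. A small sketch of this behavior, reusing tensor A from the example:

```python
import torch

A = torch.tensor([[5, 10],
                  [50, 60],
                  [100, 200]])

# A 1D tensor with 2 elements is broadcast across every row of A
row = torch.tensor([1, 2])
print(A + row)   # tensor([[  6,  12], [ 51,  62], [101, 202]])

# A plain scalar is broadcast to every element
print(A + 10)    # tensor([[ 15,  20], [ 60,  70], [110, 210]])
```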

Scalar and Matrix Multiplication of Two-Dimensional Tensors

Scalar multiplication of two-dimensional tensors is also identical to scalar multiplication of matrices. For instance, by multiplying a tensor by a scalar, say 4, you’ll be multiplying every element in the tensor by 4.

new_tensor = torch.tensor([[1, 2, 3],
                           [4, 5, 6]])
mul_scalar = 4 * new_tensor
print("result of scalar multiplication: ", mul_scalar)

result of scalar multiplication:  tensor([[ 4,  8, 12],
        [16, 20, 24]])

Coming to the multiplication of two-dimensional tensors, torch.mm() in PyTorch makes things easier for us. As with matrix multiplication in linear algebra, the number of columns in tensor A (a 2×3 tensor) must equal the number of rows in tensor B (a 3×2 tensor).

A = torch.tensor([[3, 2, 1],
                  [1, 2, 1]])
B = torch.tensor([[3, 2],
                  [1, 1],
                  [2, 1]])
A_mult_B = torch.mm(A, B)
print("multiplying A with B: ", A_mult_B)

multiplying A with B:  tensor([[13,  9],

        [ 7,  5]])
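Be careful not to confuse torch.mm() with the * operator: * multiplies element by element (and requires broadcastable shapes), while torch.mm() performs true matrix multiplication. A small sketch with square tensors, where both operations are valid but give different results:

```python
import torch

A = torch.tensor([[3, 2],
                  [1, 2]])
B = torch.tensor([[3, 2],
                  [1, 1]])

# Element-wise product: each entry multiplied by the matching entry
print(A * B)           # tensor([[9, 4], [1, 2]])

# Matrix product: rows of A against columns of B
print(torch.mm(A, B))  # tensor([[11,  8], [ 5,  4]])
```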

Further Reading

Developed at the same time as TensorFlow, PyTorch used to have a simpler syntax until TensorFlow adopted Keras in its 2.x version. To learn the basics of PyTorch, you may want to read the PyTorch tutorials:

In particular, the basics of PyTorch tensors can be found in the Tensor tutorial page:

There are also quite a few books on PyTorch that are suitable for beginners. A more recently published book should be preferred, as the tools and syntax are actively evolving. One example is

Summary

In this tutorial, you learned about two-dimensional tensors in PyTorch.

Specifically, you learned:

  • How to create two-dimensional tensors in PyTorch and explore their types and shapes.
  • About slicing and indexing operations on two-dimensional tensors in detail.
  • To apply a number of methods to tensors, such as tensor addition, multiplication, and more.



Copyright © 2022 Okanepedia.com | All Rights Reserved.
