Matrix multiply an array of points

slehar shared this question 7 years ago
Answered

I created an array of random 3D points with the command


    pts = Sequence[(RandomUniform[-3, 3], RandomUniform[-3, 3], RandomUniform[-3, 3]), i, 1, 20]


This produced a list...


    pts = {(-0.71, 1.42, 1.08), (-2.69, 0.24, -2.23), (-0.35, 2.42, 2.03), (-0.25, -0.94, -0.49), (-0.42, -1.49, 0.32), (0.2, -2.98, -2.84), (-1.49, 0.63, 0), (-0.4, -2.57, 0.61), (1.06, -2.96, 0.1), (-1.19, -1.2, 0.11), (1.36, 0.28, -2.82), (0.77, -0.33, -2.56), (-2.66, 1.8, 1.65), (-1.46, -0.35, -2.78), (0.44, -2.62, -2.46), (0.23, -1.34, -0.73), (0.11, -0.49, -0.77), (-1.89, 0.42, -0.29), (2.4, -1.42, -2.44), (-1.6, -2.91, 2.83)}


I also created a matrix M of the form


    M = {{2,0,0},{0,1,0},{0,0,1}}


I would like to multiply each point by the matrix to create a set of transformed points, so I tried


    shiftedpts = M * pts


but the result was just an empty list: shiftedpts = {}


Why doesn't that work, and how can I fix it to get the expected result?

Comments (2)


hello

try ApplyMatrix[M, pts]
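
For context: GeoGebra's * operator can multiply a matrix by a single point, but it is not defined elementwise over a list of points, which is presumably why M * pts came back empty. ApplyMatrix applies the matrix to a whole object, including a list of points. A minimal sketch of both approaches (the Zip version maps the matrix over each point P one at a time; the name shiftedpts2 is just illustrative):


    shiftedpts = ApplyMatrix[M, pts]
    shiftedpts2 = Zip[ApplyMatrix[M, P], P, pts]


Note that this particular M doubles each point's x-coordinate, so it scales the points rather than shifting them; an actual shift (translation) would be vector addition, e.g. Zip[P + (1, 0, 0), P, pts].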

Regards


That did the trick!


Many thanks! You guys are great! And GeoGebra is AWESOME!
