All Questions
1,997 questions
2 votes · 1 answer · 82 views
Python Vectorized Mask Generation (Numpy) [closed]
I have an arbitrary matrix M which is (N x A). I have a column vector V (N x 1) which holds, for each row, the number of entries (<= A) I would like to keep from the original M (starting from the leftmost)...
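A minimal vectorized sketch, assuming V holds the per-row count of leading columns to keep (M, V, N, A are the question's names; zeroing the discarded entries is just one way the mask might be applied):

import numpy as np

N, A = 4, 5
M = np.arange(N * A).reshape(N, A)      # arbitrary (N x A) matrix
V = np.array([[1], [3], [0], [5]])      # (N x 1) counts of leading entries to keep

# Compare a row of column indices against the per-row counts; broadcasting
# yields an (N x A) boolean mask that is True for the columns to keep.
mask = np.arange(A) < V
kept = np.where(mask, M, 0)             # e.g. zero out the discarded entries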
5 votes · 1 answer · 104 views
Is there a Numpy method or function to split an array of uint64 into two arrays of uint32
Say I have an array as follows:
arr = np.asarray([1, 2, 3, 4294967296, 100], dtype=np.uint64)
I now want two arrays, one array with the lower 32 bits of every element, and one with the upper 32 bits ...
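One way to do the split with masking and shifting; the explicit np.uint64 constants are there to sidestep NumPy's mixed uint64/Python-int casting rules on older versions:

import numpy as np

arr = np.asarray([1, 2, 3, 4294967296, 100], dtype=np.uint64)

low  = (arr & np.uint64(0xFFFFFFFF)).astype(np.uint32)   # lower 32 bits
high = (arr >> np.uint64(32)).astype(np.uint32)          # upper 32 bits

On little-endian hardware, arr.view(np.uint32).reshape(-1, 2) exposes the same low/high pairs without copying, which may matter for very large arrays.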
1 vote · 1 answer · 90 views
Creating an adjacency matrix fast for use in pathfinding to calculate grid distance from origin
Goal
I'm working on a project that requires me to visualize how far a given budget can reach on a map with different price zones.
Example with 3 price zones, water, countryside, and city, each having ...
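A sketch of one way to build a sparse adjacency matrix for a 4-connected grid, with edge weights taken from a hypothetical per-cell entry cost (the zone layout, the costs, and the Dijkstra step are assumptions about the setup, not the asker's code):

import numpy as np
from scipy import sparse
from scipy.sparse.csgraph import dijkstra

# Hypothetical price grid: 0 = water, 1 = countryside, 2 = city
zones = np.random.randint(0, 3, size=(100, 100))
cost = np.array([5.0, 1.0, 3.0])[zones]       # cost of entering each cell
H, W = cost.shape
idx = np.arange(H * W).reshape(H, W)

rows, cols, data = [], [], []
for di, dj in [(0, 1), (1, 0)]:               # right and down neighbours; reverse edges added too
    src = idx[:H - di, :W - dj].ravel()
    dst = idx[di:, dj:].ravel()
    rows += [src, dst]
    cols += [dst, src]
    data += [cost.ravel()[dst], cost.ravel()[src]]

A = sparse.csr_matrix(
    (np.concatenate(data), (np.concatenate(rows), np.concatenate(cols))),
    shape=(H * W, H * W),
)

# Grid distance (cheapest total cost) from the origin cell, reshaped back onto the map
dist = dijkstra(A, indices=int(idx[0, 0])).reshape(H, W)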
6 votes · 2 answers · 318 views
Is there a way to compute only the real part of a NumPy matmul?
Let's say I have two arrays a and b both with dtype np.complex128 and I want to compute C = np.matmul(a, b).real.
That is, I don't care about the imaginary part, only the real part. Is there a better ...
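One common trick is to replace the complex matmul with two real ones, since real(a @ b) = real(a) @ real(b) - imag(a) @ imag(b); whether this is actually faster depends on the BLAS backend and the array shapes:

import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal((200, 300)) + 1j * rng.standard_normal((200, 300))
b = rng.standard_normal((300, 400)) + 1j * rng.standard_normal((300, 400))

C_real = a.real @ b.real - a.imag @ b.imag        # skips the imaginary part entirely
assert np.allclose(C_real, np.matmul(a, b).real)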
9 votes · 2 answers · 1k views
Surprising lack of speedup in caching numpy calculations
I need to do a lot of calculations on numpy arrays, with some of the calculations being repeated. I had the idea of caching the results, but observe that in most cases, the cached version is slower ...
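A plausible explanation is the cost of turning arrays into hashable cache keys; a minimal sketch of that kind of cache (the tobytes() key scheme is an assumption, not the asker's code) shows where the time goes:

import numpy as np

_cache = {}

def cached_norm(arr):
    # Hashing the raw bytes of a large array on every call can cost as much
    # as the calculation the cache is meant to avoid.
    key = (arr.shape, arr.dtype.str, arr.tobytes())
    if key not in _cache:
        _cache[key] = np.linalg.norm(arr)
    return _cache[key]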
3 votes · 1 answer · 173 views
How to accelerate the cross-correlation computation of two 2D matrices in Python?
I am using Python to compute the cross-correlation of two 2D matrices, and I have implemented three different methods. Below is my experimental code along with their execution times:
import numpy as ...
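For large inputs an FFT-based correlation is usually the fastest of the standard options; a sketch with SciPy (the shapes are placeholders):

import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
a = rng.standard_normal((512, 512))
b = rng.standard_normal((64, 64))

# method="fft" computes the cross-correlation via FFTs instead of a direct sliding sum
c = signal.correlate(a, b, mode="full", method="fft")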
2 votes · 1 answer · 116 views
How to explain pandas' higher performance compared to numpy with 500k+ rows?
In some sources, I found that pandas works faster than numpy with 500k rows or more. Can someone explain this to me?
Pandas have a better performance when the number of rows is 500K or more.
— ...
0 votes · 0 answers · 109 views
Optimizing the conversion of integer vectors into binary vectors
I have a bunch of 4 million integer vectors that I have to convert to binary. My code is as follows:
def integer_vectors_to_binary(data, bits=16):
bin_arr = []
for arr in tqdm(data, desc="...
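A vectorized alternative to the per-vector loop, assuming the goal is one 0/1 value per bit (least-significant bit first; flip the last axis if the opposite order is needed):

import numpy as np

def integer_vectors_to_binary_vec(data, bits=16):
    data = np.asarray(data, dtype=np.uint32)
    # Shift every element right by 0..bits-1 and keep the low bit,
    # adding a trailing axis of length `bits`.
    return ((data[..., None] >> np.arange(bits, dtype=np.uint32)) & 1).astype(np.uint8)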
3 votes · 3 answers · 175 views
What is the most efficient way to randomly pick one positive location within a large binary mask image in Python?
I am writing a custom image data loading function to randomly crop part of a large image according to its binary mask. The function will be used in PyTorch dataloader so I want it to be as fast and ...
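One fast option is to sample from the flat indices of the positive pixels (the function name and the use of NumPy's default RNG are assumptions; it presumes the mask has at least one positive pixel):

import numpy as np

rng = np.random.default_rng()

def random_positive_location(mask):
    flat = np.flatnonzero(mask)                   # flat indices of all positive pixels
    pick = rng.integers(flat.size)                # one uniformly random choice
    return np.unravel_index(flat[pick], mask.shape)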
0 votes · 1 answer · 63 views
Generalizing a gaussian mix to take any number of arguments with numpy.vectorize causes performance issues
I am optimizing a gaussian mix using maximum likelihood estimation. Originally I used the following model:
def normal(x, mu, sigma):
"""
Gaussian (normal) probability density ...
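Broadcasting usually removes the need for numpy.vectorize here: evaluate every component's density for every sample in one pass and take the weighted sum over the component axis (the parameter layout is an assumption, not the asker's model):

import numpy as np

def normal(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def gaussian_mix(x, weights, mus, sigmas):
    # x: (n,) samples; weights/mus/sigmas: (k,) per-component parameters.
    # Broadcasting (n, 1) against (k,) gives an (n, k) table of densities.
    dens = normal(x[:, None], np.asarray(mus), np.asarray(sigmas))
    return dens @ np.asarray(weights)             # (n,) mixture density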
1 vote · 1 answer · 66 views
How to efficiently compute and process 3x3x3 voxel neighborhoods in a 3D NumPy array?
I am working on a function to process 3D images voxel-by-voxel. For each voxel, I compute the difference between the voxel value and its 3x3x3 neighborhood, apply distance-based scaling, and determine ...
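NumPy's sliding_window_view exposes every 3x3x3 neighbourhood of the interior voxels without a Python loop; the difference-to-centre step below is a guess at the intended processing:

import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

vol = np.random.rand(64, 64, 64)

# windows[i, j, k] is the 3x3x3 neighbourhood of interior voxel (i+1, j+1, k+1)
windows = sliding_window_view(vol, (3, 3, 3))

centres = vol[1:-1, 1:-1, 1:-1]
diffs = windows - centres[..., None, None, None]  # voxel-minus-neighbourhood differences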
0 votes · 1 answer · 72 views
Bottleneck using np.where (in terms of computing resources, i.e. memory and speed)
In the current example, I spent (many) hours trying to find a way to sort the M matrix so that it corresponds exactly to the target one.
The first 2 columns are used to reorganize M
The matrixOrder ...
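If the reordering really only depends on the first two columns, np.lexsort gives the row order directly and avoids repeated np.where searches (whether this matches the intended target ordering is an assumption):

import numpy as np

M = np.array([[2, 1, 10.],
              [1, 2, 20.],
              [1, 1, 30.],
              [2, 2, 40.]])

# lexsort sorts by the last key first, so pass column 1 before column 0
order = np.lexsort((M[:, 1], M[:, 0]))
M_sorted = M[order]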
2 votes · 2 answers · 90 views
Efficiently Removing an Integer from a Matrix of Row-Wise Permutations of Integers
I have an n x n matrix where each row is a permutation of the integers from 1 to n.
For a given integer k in {1,2,...,n}, my goal is to locate k in each row and remove it, such that the remaining ...
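Since k occurs exactly once per row, a boolean mask followed by a reshape removes it from every row in one shot (a minimal sketch using the question's names):

import numpy as np

n, k = 5, 3
rng = np.random.default_rng(0)
M = np.array([rng.permutation(np.arange(1, n + 1)) for _ in range(n)])

# The mask drops exactly one element per row, so the flat result
# reshapes cleanly to (n, n - 1) with row order preserved.
reduced = M[M != k].reshape(n, n - 1)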
-1 votes · 1 answer · 58 views
Is there an efficient way to update / replace a specific value of a dask array in python?
So I have a dask array of integers (1 x 8192) and I want to find an efficient way to replace a specific value.
This is the code I am currently using, which is very slow, because dask is immutable, so ...
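Because dask arrays are immutable, the usual pattern is to build a new lazy array with da.where rather than assigning in place (old_value and new_value are placeholders):

import numpy as np
import dask.array as da

x = da.from_array(np.arange(8192, dtype=np.int64).reshape(1, 8192), chunks=(1, 1024))

old_value, new_value = 42, -1
x = da.where(x == old_value, new_value, x)   # lazy; nothing is computed yet
result = x.compute()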
2 votes · 4 answers · 208 views
Improving execution time of analytical ray tracing algorithm
Background
I've written a Python class that is designed to compute the time taken for a ray of light to propagate between two points (init_point and term_point) in a complex medium (modeled using the ...