Conversation

frankier

Fixes #6924

When mapping regions of memory to Julia, the region is typically modeled as an array with an extent. Previously, this extent was always VECTOR_SIZE = 2048; however, the child elements of array-like data structures can hold data with larger extents. This commit uses the actual extent for these variable-sized data structures.
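The bug pattern the PR describes can be illustrated with a minimal sketch. This is not the actual DuckDB Julia client code (which wraps native buffers rather than Python lists); the function names and buffer layout here are hypothetical, chosen only to show why a fixed VECTOR_SIZE truncates the child buffer of a list column while the actual extent does not.

```python
# Hypothetical sketch of the bug pattern, not DuckDB's real code:
# the child buffer of a list column holds the concatenated elements
# of all rows, so it can be longer than VECTOR_SIZE. Wrapping it with
# a fixed length truncates it, and later offset lookups go out of bounds.

VECTOR_SIZE = 2048  # DuckDB's standard vector size

def wrap_child_buffer_fixed(buffer):
    # Buggy behaviour: always assume at most VECTOR_SIZE entries.
    return buffer[:VECTOR_SIZE]

def wrap_child_buffer_actual(buffer, extent):
    # Fixed behaviour: use the actual extent reported for the child data.
    return buffer[:extent]

# A list column whose 2048 rows each hold 2 elements has a child
# buffer of 4096 entries -- larger than VECTOR_SIZE.
child = list(range(4096))

fixed = wrap_child_buffer_fixed(child)
actual = wrap_child_buffer_actual(child, len(child))

assert len(fixed) == 2048   # half the data is silently cut off
assert len(actual) == 4096  # the full child buffer is visible
```

With the fixed-size wrap, any row whose elements start past offset 2048 triggers a bounds error on access, which matches the behaviour reported in the linked issue.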

@frankier (Author) commented Apr 12, 2023

brb with tests (EDIT: done)

@frankier frankier force-pushed the fix-julia-big-arrays branch from f95872b to e99192e Compare April 13, 2023 07:47
@frankier frankier marked this pull request as ready for review April 13, 2023 07:48
@frankier frankier force-pushed the fix-julia-big-arrays branch from e99192e to 3a38289 Compare April 13, 2023 08:09
@frankier frankier force-pushed the fix-julia-big-arrays branch from 3a38289 to d975649 Compare April 13, 2023 09:17
@Mytherin Mytherin merged commit e624e70 into duckdb:master Apr 13, 2023
@Mytherin (Collaborator)

Thanks for the PR! LGTM

Successfully merging this pull request may close these issues.

Converting result set with column with cumulative array size of >2048 (= VECTOR_SIZE) to Julia DataFrame causes bounds error