
GPUs are optimized for number crunching. Do they get used at all for string processing? I ask because I develop data-wrangling software, and most of it is string processing (joins, concatenations, aggregations, filtering, etc.) rather than numerical.


Do you have millions of strings that need to be manipulated in the same way at the same time?


Yes. For example, you might want to change a column of 10 million strings from upper case to lower case. Or concatenate two columns to create a third column. It is not clear to me whether this would be any faster on a GPU.
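For what it's worth, GPU dataframe libraries such as RAPIDS cuDF expose exactly these columnar operations behind a pandas-style API, running one thread per element. A plain-Python sketch of the two operations described above (column names and data are made up for illustration):

```python
# Two "columns" of strings, as plain Python lists
first = ["ALICE", "BOB", "CAROL"]
last = ["SMITH", "JONES", "LEE"]

# Upper case -> lower case across the whole column
first_lower = [s.lower() for s in first]

# Concatenate two columns element-wise to create a third
full = [f + " " + l for f, l in zip(first_lower, last)]
```

On a CPU this is a sequential per-element loop; the GPU version applies the same per-element kernel to millions of rows in parallel, which is why it can pay off despite the transfer cost.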


Also, you might want to build a hash table from a million values in a column, so you can use it for a join.
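The hash-join idea above can be sketched in plain Python: build a dict from one column's values to their row indices, then probe it with the other column. The key lists here are invented example data:

```python
# Build side: map each key to the row indices where it appears
left_keys = ["a", "b", "c", "b"]
right_keys = ["b", "c", "d"]

hash_table = {}
for i, k in enumerate(left_keys):
    hash_table.setdefault(k, []).append(i)

# Probe side: for each right row, look up matching left rows
matches = [(i, j) for j, k in enumerate(right_keys)
           for i in hash_table.get(k, [])]
# matches holds (left_row, right_row) pairs that share a key
```

The build phase is inherently somewhat sequential (insertions into one table), but the probe phase is embarrassingly parallel, which is the part GPU join implementations parallelize.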




