GPUs are optimized for number crunching. Do they get used at all for string processing? I ask because I develop data-wrangling software, and most of it is string processing (joins, concatenations, aggregations, filtering, etc.) rather than numerical.
Yes. For example, you might want to change a column of 10 million strings from upper case to lower case, or concatenate two columns to create a third. It is not clear to me whether this would be any faster on a GPU.
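For what it's worth, these are exactly the kinds of column-wise string operations that GPU dataframe libraries such as RAPIDS cuDF target, and cuDF deliberately mirrors the pandas API. A minimal sketch of the two examples above, written with pandas so it runs on any CPU (assumption: with cuDF installed and a CUDA GPU available, the same code should work by importing `cudf` in place of `pandas`):

```python
import pandas as pd  # with RAPIDS installed: import cudf as pd

# Toy data; the real workload would be millions of rows.
df = pd.DataFrame({
    "first": ["ALICE", "BOB", "CAROL"],
    "last": ["SMITH", "JONES", "LEE"],
})

# Change a column of strings from upper case to lower case.
df["first_lower"] = df["first"].str.lower()

# Concatenate two columns to create a third column.
df["full_name"] = df["first"] + " " + df["last"]

print(df["first_lower"].tolist())  # ['alice', 'bob', 'carol']
print(df["full_name"].tolist())    # ['ALICE SMITH', 'BOB JONES', 'CAROL LEE']
```

Whether the GPU version is actually faster depends heavily on data size and on transfer costs between host and device memory; for small columns the copy overhead can dominate.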