Cassandra with C#: How to bulk insert over a million SQL records into Cassandra with high performance


How can I convert the rows of a SQL DataSet/DataTable into a Cassandra rowset using C#?

Using a Cassandra prepared statement for the bulk insert gives the following error:

"Index was outside the bounds of the array." The `values` parameter has 100+ values; the failing call is `batch.Add(userTrackStmt.Bind(values));`.
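For context, this error usually means the number of values passed to `Bind()` does not match the number of `?` placeholders in the prepared statement, for example when one flat array covering many rows is bound in a single call. Below is a minimal sketch of the row-by-row binding the DataStax C# driver expects; the `user_tracks(user_id, track_id, played_at)` table and the `LoadRowsFromSql()` helper are assumptions for illustration only.

```csharp
using System.Data;
using Cassandra;

class BatchInsertSketch
{
    static void Main()
    {
        var cluster = Cluster.Builder().AddContactPoint("127.0.0.1").Build();
        var session = cluster.Connect("music");

        // One "?" per column; Bind() must receive exactly one value per placeholder.
        var userTrackStmt = session.Prepare(
            "INSERT INTO user_tracks (user_id, track_id, played_at) VALUES (?, ?, ?)");

        DataTable rows = LoadRowsFromSql();   // hypothetical helper that runs the SQL query
        var batch = new BatchStatement().SetBatchType(BatchType.Unlogged);

        foreach (DataRow row in rows.Rows)
        {
            // Bind one row at a time instead of one big values array for all rows.
            batch.Add(userTrackStmt.Bind(row["user_id"], row["track_id"], row["played_at"]));
        }

        session.Execute(batch);   // keep individual batches small in real use
        cluster.Shutdown();
    }

    static DataTable LoadRowsFromSql() => new DataTable();   // placeholder for the real SQL read
}
```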

I'd say the tools you can use for a fast import of a million records depend on the complexity of the SQL.
If you are having issues with the bulk insert (you haven't provided your code or table structures here), try one of the following:
1. The cqlsh COPY command (export the SQL result to CSV, then load it with COPY ... FROM).
2. Use the Spark Streaming API via [Mobius](https://github.com/microsoft/mobius). Technically, you read the first table (or the SQL result) and write the data to the second table in a streaming way; a driver-only sketch of the same idea is shown below.
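If Spark/Mobius is more machinery than you need, the same read-and-stream pattern can be approximated with plain ADO.NET and the DataStax driver. This is a sketch of that alternative, not the Mobius API: the connection string, table, and column names are assumptions, and the cap of 256 in-flight requests is an arbitrary throttle.

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Threading.Tasks;
using Cassandra;

class SqlToCassandraStream
{
    static async Task Main()
    {
        var cluster = Cluster.Builder().AddContactPoint("127.0.0.1").Build();
        var session = cluster.Connect("music");
        var insert = session.Prepare(
            "INSERT INTO user_tracks (user_id, track_id, played_at) VALUES (?, ?, ?)");

        var pending = new List<Task>();
        using (var sql = new SqlConnection("Server=.;Database=Source;Integrated Security=true"))
        {
            sql.Open();
            var cmd = new SqlCommand(
                "SELECT user_id, track_id, played_at FROM dbo.UserTracks", sql);

            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Stream each SQL row straight into Cassandra as an async insert.
                    pending.Add(session.ExecuteAsync(
                        insert.Bind(reader["user_id"], reader["track_id"], reader["played_at"])));

                    if (pending.Count >= 256)        // cap the number of in-flight requests
                    {
                        await Task.WhenAll(pending);
                        pending.Clear();
                    }
                }
            }
        }

        await Task.WhenAll(pending);                 // flush the remainder
        cluster.Shutdown();
    }
}
```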

