hbase - How do I make Spark write to multiple sinks in foreachRDD? -


I have written a Spark Streaming program that reads from a Kafka source. After applying transformations, I need to send the data to two Kafka producers and to HBase.

I receive data via Spark Streaming:

customer1   21631512435 2   1449540003.803  1449540363.571  25566530    27670   1557041 19491   65664   1   197.26.8.142    197.31.74.208
customer2   21631526589 4   1449540003.821  1339540363.565  25536520    27369   1545811 19487   65659   5   197.25.2.135    197.31.74.206

I want to apply transformations, send the result to two Kafka producers, and save a copy to HBase.

I found examples here about sending data to Kafka producers and saving to HBase, but the problem is that I don't have sbt or Maven and I'm using the Spark shell (Spark 1.3). I've run into many problems importing JARs.
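Not from the original question: since sbt/Maven aren't available, one common workaround is to download the dependency JARs manually and pass them to the shell with `--jars`, which Spark 1.3 supports. The paths and versions below are placeholders for whatever matches your cluster:

```shell
# launch spark-shell with the Kafka and HBase client JARs on the classpath
# (paths and version numbers are examples, not prescriptions)
spark-shell --jars /path/to/kafka-clients-0.8.2.1.jar,\
/path/to/hbase-client-1.0.0.jar,\
/path/to/hbase-common-1.0.0.jar,\
/path/to/hbase-protocol-1.0.0.jar
```

The same list can alternatively go into `spark.executor.extraClassPath` in `spark-defaults.conf` if the JARs are already present on every worker node.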

At the moment I'm only reading from Kafka and saving to HDFS. Can anyone help me complete this task?
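As a starting point, here is a minimal sketch of the `foreachRDD` pattern that writes each record to two Kafka topics and to an HBase table. It assumes a transformed `DStream[String]` named `stream`; the topic names `topicA`/`topicB`, the table `customers`, the column family `cf`, and the broker address are all hypothetical and would need to match your setup (the HBase calls use the 1.0 client API):

```scala
import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{ConnectionFactory, Put}
import org.apache.hadoop.hbase.util.Bytes

stream.foreachRDD { rdd =>
  rdd.foreachPartition { records =>
    // Open connections once per partition, never per record:
    // producers and HBase connections are not serializable, so they
    // must be created on the executors, inside foreachPartition.
    val props = new Properties()
    props.put("bootstrap.servers", "broker1:9092")  // placeholder broker
    props.put("key.serializer",
      "org.apache.kafka.common.serialization.StringSerializer")
    props.put("value.serializer",
      "org.apache.kafka.common.serialization.StringSerializer")
    val producer = new KafkaProducer[String, String](props)

    val hbaseConf = HBaseConfiguration.create()
    val hbaseConn = ConnectionFactory.createConnection(hbaseConf)
    val table = hbaseConn.getTable(TableName.valueOf("customers"))

    records.foreach { line =>
      // Send the same record to both output topics.
      producer.send(new ProducerRecord[String, String]("topicA", line))
      producer.send(new ProducerRecord[String, String]("topicB", line))

      // Use the first whitespace-separated field (the customer id)
      // as the HBase row key and store the raw line in one cell.
      val fields = line.split("\\s+")
      val put = new Put(Bytes.toBytes(fields(0)))
      put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("raw"),
        Bytes.toBytes(line))
      table.put(put)
    }

    producer.close()
    table.close()
    hbaseConn.close()
  }
}
```

Creating the producer and HBase connection per partition (rather than per record, or once on the driver) is the usual trade-off recommended in the Spark Streaming programming guide's "Design Patterns for using foreachRDD" section.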

