Riak Spark Connector 1.6.3 Released

Alex Moore amoore at basho.com
Fri Mar 17 10:21:59 EDT 2017


Hi Everyone,

Version 1.6.3 of the Riak Spark Connector has been released and is now
available through both GitHub
<https://github.com/basho/spark-riak-connector/releases/tag/v1.6.3> and Spark
Packages <https://spark-packages.org/package/basho/spark-riak-connector>.
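If you pull the connector from Spark Packages, you can pass it to spark-shell
or spark-submit with the --packages flag. A rough sketch (the exact package
coordinates are an assumption here; please confirm them against the release
page and the project README for your Scala/Spark version):

```shell
# Hypothetical example: "basho:spark-riak-connector:1.6.3" follows the usual
# Spark Packages org:name:version convention, but verify the coordinates
# against the project README before using them.
spark-shell --packages basho:spark-riak-connector:1.6.3
```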
I'd like to thank the team, especially Sergey Galkin
<https://github.com/srgg>, for all their hard work on this release.

This release includes the following fixes and enhancements:


   - Enhancement - Data locality support for Coverage Plan Based
   Partitioning #230
   <https://github.com/basho/spark-riak-connector/pull/230>
   - Enhancement - Improve Coverage Plan Based Partitioning: smart split
   calculation and more accurate coverage entry distribution across the
   partitions #231 <https://github.com/basho/spark-riak-connector/pull/231>
   - Critical Fix - Python serialization for empty JSON objects #226
   <https://github.com/basho/spark-riak-connector/pull/226>
   - Fix - Remove double filtering for DataFrames #228
   <https://github.com/basho/spark-riak-connector/pull/228>


As always, issues and PRs are welcome via the project's GitHub page
<https://github.com/basho/spark-riak-connector>.

Thanks,
Alex Moore
Clients Team Lead, Basho
amoore at basho.com