Update to 2017 Census #171
Comments
We could do that. Wanna send a pull request?
Sure! Do you have examples somewhere of how to convert to the Vega TopoJSON format?
One thing to keep in mind is whether examples and test specifications in Vega and/or Vega-Lite may be using a dataset before we make any updates. Will it break existing test cases? In the case of geo data, might the inclusion of different ids/regions break joins (lookups) with other (also "outdated") data sets? While Vega keeps a separate folder of data sets for test cases to guard against breakage due to vega-datasets updates, the online examples do pull from vega-datasets and so could be subject to breakage. For example, the unemployment choropleth might break with updated data. I'm (in general) supportive of using more recent examples, but I want to make sure we are keeping these downstream effects in mind. Adding new files with timestamps is probably preferable to overwriting existing files. Thanks!
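To make the join-breakage concern concrete, here is a minimal sketch of the kind of id-based lookup a choropleth does. The table shape and values are made up for illustration (not the actual vega-datasets files), but the renamed county is real: Shannon County, SD (FIPS 46113) became Oglala Lakota County (46102) in 2015.

```python
# Unemployment rates keyed by county FIPS id (values invented for this sketch).
unemployment = {"01001": 5.1, "01003": 4.9, "46113": 3.8}

# County ids present in the old vs. an updated TopoJSON file.
old_county_ids = ["01001", "01003", "46113"]  # includes Shannon County, SD
new_county_ids = ["01001", "01003", "46102"]  # 46113 renamed to Oglala Lakota (46102)

def join(county_ids, table):
    """Return {county_id: rate}; counties missing from the table get None."""
    return {cid: table.get(cid) for cid in county_ids}

print(join(old_county_ids, unemployment))  # every county finds a rate
print(join(new_county_ids, unemployment))  # 46102 -> None: a blank region on the map
```

The failure is silent: nothing errors, the renamed county just renders with no data, which is why updating the geometry without also updating the joined dataset is risky.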
Perhaps timestamp the file names (…)?
Yes, it would be better to call it us10m_2017.json. The FIPS codes change regularly. Do you have an example of how to convert?
I’d prefer not to have multiple versions of the same data unless it causes issues with the existing examples.
@montyvesselinov I want to cut a 2.0 release and include this update. I think the only way forward is to also include an updated unemployment dataset (and include a year such as …).
Closing as we are moving to v2 now. |
Any plans to update us-10m to represent the latest county FIPS codes?
https://cdn.jsdelivr.net/npm/us-atlas@2/us/10m.json