Inspect Parquet from command line

How do I inspect the content of a Parquet file from the command line?

The only option I can see right now is:

$ hadoop fs -get my-path local-file
$ parquet-tools head local-file | less

I would like to

  1. avoid creating the local-file
  2. view the file content as json rather than the typeless text that parquet-tools prints.

Is there an easy way to do this?

I suggest just building and running parquet-tools.jar for your Hadoop distribution.

Check out the GitHub project: https://github.com/apache/parquet-mr/tree/master/parquet-tools

hadoop jar ./parquet-tools-<VERSION>.jar <command>

By default parquet-tools looks in the local file system, so to point it at HDFS you need to prefix the file path with hdfs://. In your case you can do something like this:

parquet-tools head hdfs://localhost/<hdfs-path> | less

I had the same problem and this worked fine for me. There is no need to download the file locally first.

You can use parquet-tools with the cat command and its --json option in order to view the file as JSON, without a local copy.

Here is an example:

parquet-tools cat --json hdfs://localhost/tmp/save/part-r-00000-6a3ccfae-5eb9-4a88-8ce8-b11b2644d5de.gz.parquet

This prints out the data in JSON format:

{"name":"gil","age":48,"city":"london"}
{"name":"jane","age":30,"city":"new york"}
{"name":"jordan","age":18,"city":"toronto"}

Disclaimer: tested on Cloudera CDH 5.12.0
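
Since the output is newline-delimited JSON, you can also pipe it through jq to pretty-print it or pull out individual fields (a small sketch; it assumes jq is installed wherever you run parquet-tools):

parquet-tools cat --json hdfs://localhost/<hdfs-path> | jq .
parquet-tools cat --json hdfs://localhost/<hdfs-path> | jq -r '.name'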

I prefer to use the HDFS NFS Gateway + autofs for easy investigation of HDFS files.

My setup:

  • HDFS NFS Gateway service running on the namenode.
  • The autofs service bundled with the distribution, with the following configuration change made to auto.master:
/net    -hosts nobind

I can easily run commands like the following to investigate any HDFS file:

head /net/<namenodeIP>/path/to/hdfs/file
parquet-tools head /net/<namenodeIP>/path/to/hdfs/par-file
rsync -rv /local/directory/ /net/<namenodeIP>/path/to/hdfs/parentdir/

Forget about the hadoop* and hdfs* commands ;)
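
If you would rather not set up autofs, a one-off NFS mount against the gateway works too. A minimal sketch, assuming the NFS Gateway is running on the namenode and using /mnt/hdfs as an arbitrary mount point:

sudo mkdir -p /mnt/hdfs
sudo mount -t nfs -o vers=3,proto=tcp,nolock <namenodeIP>:/ /mnt/hdfs
parquet-tools head /mnt/hdfs/path/to/hdfs/par-file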

Install Homebrew on your Mac (see https://brew.sh/) and then:

brew install parquet-tools

Once done, you can use the parquet-tools binary (which should now be on your path) at your command line for various commands.

parquet-tools or parquet-tools -h will give you usage information.

Examples:

> parquet-tools rowcount part-00000-fc34f237-c985-4ebc-822b-87fa446f6f70.c000.snappy.parquet 
Total RowCount: 148192
> parquet-tools head -n 1 part-00000-fc34f237-c985-4ebc-822b-87fa446f6f70.c000.snappy.parquet 
:created_at = 2019-02-28T00:16:06.329Z
:id = row-wive~i58u-qaeu
:updated_at = 2019-02-28T00:16:06.329Z
agency = 1
body_style = PA
color = GY
fine_amount = 63
issue_date = 17932
issue_time = 1950
latitude = 64379050
location = 12743 DAVENTRY
longitude = 19261609
make = HYDA
marked_time = 
meter_id = 
plate_expiry_date = 18048
route = 16X2
rp_state_plate = CA
ticket_number = 1020798376
vin = 
violation_code = 22502A#
violation_description = 18 IN. CURB/2 WAY
> parquet-tools meta part-00000-fc34f237-c985-4ebc-822b-87fa446f6f70.c000.snappy.parquet 
file:                  file:/Users/matthewropp/team_demo/los-angeles-parking-citations/raw_citations/issue_month=201902/part-00000-fc34f237-c985-4ebc-822b-87fa446f6f70.c000.snappy.parquet 
creator:               parquet-mr version 1.10.0 (build 031a6654009e3b82020012a18434c582bd74c73a) 
extra:                 org.apache.spark.sql.parquet.row.metadata = {"type":"struct","fields":[{"name":":created_at","type":"string","nullable":true,"metadata":{}},{"name":":id","type":"string","nullable":true,"metadata":{}},{"name":":updated_at","type":"string","nullable":true,"metadata":{}},{"name":"agency","type":"integer","nullable":true,"metadata":{}},{"name":"body_style","type":"string","nullable":true,"metadata":{}},{"name":"color","type":"string","nullable":true,"metadata":{}},{"name":"fine_amount","type":"integer","nullable":true,"metadata":{}},{"name":"issue_date","type":"date","nullable":true,"metadata":{}},{"name":"issue_time","type":"integer","nullable":true,"metadata":{}},{"name":"latitude","type":"decimal(8,1)","nullable":true,"metadata":{}},{"name":"location","type":"string","nullable":true,"metadata":{}},{"name":"longitude","type":"decimal(8,1)","nullable":true,"metadata":{}},{"name":"make","type":"string","nullable":true,"metadata":{}},{"name":"marked_time","type":"string","nullable":true,"metadata":{}},{"name":"meter_id","type":"string","nullable":true,"metadata":{}},{"name":"plate_expiry_date","type":"date","nullable":true,"metadata":{}},{"name":"route","type":"string","nullable":true,"metadata":{}},{"name":"rp_state_plate","type":"string","nullable":true,"metadata":{}},{"name":"ticket_number","type":"string","nullable":false,"metadata":{}},{"name":"vin","type":"string","nullable":true,"metadata":{}},{"name":"violation_code","type":"string","nullable":true,"metadata":{}},{"name":"violation_description","type":"string","nullable":true,"metadata":{}}]} 

file schema:           spark_schema 
--------------------------------------------------------------------------------
:created_at:           OPTIONAL BINARY O:UTF8 R:0 D:1
:id:                   OPTIONAL BINARY O:UTF8 R:0 D:1
:updated_at:           OPTIONAL BINARY O:UTF8 R:0 D:1
agency:                OPTIONAL INT32 R:0 D:1
body_style:            OPTIONAL BINARY O:UTF8 R:0 D:1
color:                 OPTIONAL BINARY O:UTF8 R:0 D:1
fine_amount:           OPTIONAL INT32 R:0 D:1
issue_date:            OPTIONAL INT32 O:DATE R:0 D:1
issue_time:            OPTIONAL INT32 R:0 D:1
latitude:              OPTIONAL INT32 O:DECIMAL R:0 D:1
location:              OPTIONAL BINARY O:UTF8 R:0 D:1
longitude:             OPTIONAL INT32 O:DECIMAL R:0 D:1
make:                  OPTIONAL BINARY O:UTF8 R:0 D:1
marked_time:           OPTIONAL BINARY O:UTF8 R:0 D:1
meter_id:              OPTIONAL BINARY O:UTF8 R:0 D:1
plate_expiry_date:     OPTIONAL INT32 O:DATE R:0 D:1
route:                 OPTIONAL BINARY O:UTF8 R:0 D:1
rp_state_plate:        OPTIONAL BINARY O:UTF8 R:0 D:1
ticket_number:         REQUIRED BINARY O:UTF8 R:0 D:0
vin:                   OPTIONAL BINARY O:UTF8 R:0 D:1
violation_code:        OPTIONAL BINARY O:UTF8 R:0 D:1
violation_description: OPTIONAL BINARY O:UTF8 R:0 D:1

row group 1:           RC:148192 TS:10503944 OFFSET:4 
--------------------------------------------------------------------------------
:created_at:            BINARY SNAPPY DO:0 FPO:4 SZ:607/616/1.01 VC:148192 ENC:BIT_PACKED,PLAIN_DICTIONARY,RLE ST:[min: 2019-02-28T00:16:06.329Z, max: 2019-03-02T00:20:00.249Z, num_nulls: 0]
:id:                    BINARY SNAPPY DO:0 FPO:611 SZ:2365472/3260525/1.38 VC:148192 ENC:BIT_PACKED,PLAIN,RLE ST:[min: row-2229_y75z.ftdu, max: row-zzzs_4hta.8fub, num_nulls: 0]
:updated_at:            BINARY SNAPPY DO:0 FPO:2366083 SZ:602/611/1.01 VC:148192 ENC:BIT_PACKED,PLAIN_DICTIONARY,RLE ST:[min: 2019-02-28T00:16:06.329Z, max: 2019-03-02T00:20:00.249Z, num_nulls: 0]
agency:                 INT32 SNAPPY DO:0 FPO:2366685 SZ:4871/5267/1.08 VC:148192 ENC:BIT_PACKED,PLAIN_DICTIONARY,RLE ST:[min: 1, max: 58, num_nulls: 0]
body_style:             BINARY SNAPPY DO:0 FPO:2371556 SZ:36244/61827/1.71 VC:148192 ENC:BIT_PACKED,PLAIN_DICTIONARY,RLE ST:[min: , max: WR, num_nulls: 0]
color:                  BINARY SNAPPY DO:0 FPO:2407800 SZ:111267/111708/1.00 VC:148192 ENC:BIT_PACKED,PLAIN_DICTIONARY,RLE ST:[min: , max: YL, num_nulls: 0]
fine_amount:            INT32 SNAPPY DO:0 FPO:2519067 SZ:71989/82138/1.14 VC:148192 ENC:BIT_PACKED,PLAIN_DICTIONARY,RLE ST:[min: 25, max: 363, num_nulls: 63]
issue_date:             INT32 SNAPPY DO:0 FPO:2591056 SZ:20872/23185/1.11 VC:148192 ENC:BIT_PACKED,PLAIN_DICTIONARY,RLE ST:[min: 2019-02-01, max: 2019-02-27, num_nulls: 0]
issue_time:             INT32 SNAPPY DO:0 FPO:2611928 SZ:210026/210013/1.00 VC:148192 ENC:BIT_PACKED,PLAIN_DICTIONARY,RLE ST:[min: 1, max: 2359, num_nulls: 41]
latitude:               INT32 SNAPPY DO:0 FPO:2821954 SZ:508049/512228/1.01 VC:148192 ENC:BIT_PACKED,PLAIN_DICTIONARY,RLE ST:[min: 99999.0, max: 6513161.2, num_nulls: 0]
location:               BINARY SNAPPY DO:0 FPO:3330003 SZ:1251364/2693435/2.15 VC:148192 ENC:BIT_PACKED,PLAIN_DICTIONARY,PLAIN,RLE ST:[min: , max: ZOMBAR/VALERIO, num_nulls: 0]
longitude:              INT32 SNAPPY DO:0 FPO:4581367 SZ:516233/520692/1.01 VC:148192 ENC:BIT_PACKED,PLAIN_DICTIONARY,RLE ST:[min: 99999.0, max: 1941557.4, num_nulls: 0]
make:                   BINARY SNAPPY DO:0 FPO:5097600 SZ:147034/150364/1.02 VC:148192 ENC:BIT_PACKED,PLAIN_DICTIONARY,RLE ST:[min: , max: YAMA, num_nulls: 0]
marked_time:            BINARY SNAPPY DO:0 FPO:5244634 SZ:11675/17658/1.51 VC:148192 ENC:BIT_PACKED,PLAIN_DICTIONARY,RLE ST:[min: , max: 959.0, num_nulls: 0]
meter_id:               BINARY SNAPPY DO:0 FPO:5256309 SZ:172432/256692/1.49 VC:148192 ENC:BIT_PACKED,PLAIN_DICTIONARY,RLE ST:[min: , max: YO97, num_nulls: 0]
plate_expiry_date:      INT32 SNAPPY DO:0 FPO:5428741 SZ:149849/152288/1.02 VC:148192 ENC:BIT_PACKED,PLAIN_DICTIONARY,RLE ST:[min: 2000-02-01, max: 2099-12-01, num_nulls: 18624]
route:                  BINARY SNAPPY DO:0 FPO:5578590 SZ:38377/45948/1.20 VC:148192 ENC:BIT_PACKED,PLAIN_DICTIONARY,RLE ST:[min: , max: WTD, num_nulls: 0]
rp_state_plate:         BINARY SNAPPY DO:0 FPO:5616967 SZ:33281/60186/1.81 VC:148192 ENC:BIT_PACKED,PLAIN_DICTIONARY,RLE ST:[min: AB, max: XX, num_nulls: 0]
ticket_number:          BINARY SNAPPY DO:0 FPO:5650248 SZ:801039/2074791/2.59 VC:148192 ENC:BIT_PACKED,PLAIN ST:[min: 1020798376, max: 4350802142, num_nulls: 0]
vin:                    BINARY SNAPPY DO:0 FPO:6451287 SZ:64/60/0.94 VC:148192 ENC:BIT_PACKED,PLAIN_DICTIONARY,RLE ST:[min: , max: , num_nulls: 0]
violation_code:         BINARY SNAPPY DO:0 FPO:6451351 SZ:94784/131071/1.38 VC:148192 ENC:BIT_PACKED,PLAIN_DICTIONARY,RLE ST:[min: 000, max: 8942, num_nulls: 0]
violation_description:  BINARY SNAPPY DO:0 FPO:6546135 SZ:95937/132641/1.38 VC:148192 ENC:BIT_PACKED,PLAIN_DICTIONARY,RLE ST:[min: , max: YELLOW ZONE, num_nulls: 0]
> parquet-tools dump -m -c make part-00000-fc34f237-c985-4ebc-822b-87fa446f6f70.c000.snappy.parquet | head -20
BINARY make 
--------------------------------------------------------------------------------
*** row group 1 of 1, values 1 to 148192 *** 
value 1:      R:0 D:1 V:HYDA
value 2:      R:0 D:1 V:NISS
value 3:      R:0 D:1 V:NISS
value 4:      R:0 D:1 V:TOYO
value 5:      R:0 D:1 V:AUDI
value 6:      R:0 D:1 V:MERC
value 7:      R:0 D:1 V:LEX
value 8:      R:0 D:1 V:BMW
value 9:      R:0 D:1 V:GMC
value 10:     R:0 D:1 V:HOND
value 11:     R:0 D:1 V:TOYO
value 12:     R:0 D:1 V:NISS
value 13:     R:0 D:1 V:
value 14:     R:0 D:1 V:THOR
value 15:     R:0 D:1 V:DODG
value 16:     R:0 D:1 V:DODG
value 17:     R:0 D:1 V:HOND

If you are using HDFS, the following commands are very useful since they are used frequently (left here for future reference):

hadoop jar parquet-tools-1.9.0.jar schema hdfs://path/to/file.snappy.parquet
hadoop jar parquet-tools-1.9.0.jar head -n5 hdfs://path/to/file.snappy.parquet
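
Other parquet-tools subcommands shown elsewhere in this thread work the same way over HDFS; for example meta, and (if your parquet-tools version supports it) the cat --json option:

hadoop jar parquet-tools-1.9.0.jar meta hdfs://path/to/file.snappy.parquet
hadoop jar parquet-tools-1.9.0.jar cat --json hdfs://path/to/file.snappy.parquet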

I found this program really useful: https://github.com/chhantyal/parquet-cli

It lets you view parquet files without having the whole infrastructure installed.

Just type:

pip install parquet-cli
parq input.parquet --head 10
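
Besides --head, the project README lists a few more options; worth confirming with parq --help for the version you installed:

parq input.parquet            # basic metadata
parq input.parquet --schema   # print the schema
parq input.parquet --count    # print the row count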

On Windows 10 x64, I just built parquet-reader from source:

Windows 10 + WSL + GCC

Install WSL with Ubuntu LTS 18.04. Upgrade gcc to v9.2.1 and CMake to the latest version. Bonus: install Windows Terminal.

git clone https://github.com/apache/arrow
cd arrow
cd cpp
mkdir buildgcc
cd buildgcc
cmake .. -DPARQUET_BUILD_EXECUTABLES=ON -DARROW_PARQUET=ON -DARROW_WITH_SNAPPY=ON -DARROW_WITH_BROTLI=ON -DPARQUET_BUILD_EXAMPLES=ON -DARROW_CSV=ON
make -j 20
cd release
./parquet-reader
Usage: parquet-reader [--only-metadata] [--no-memory-map] [--json] [--dump] [--print-key-value-metadata] [--columns=...] <file>
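
Using the flags listed in that usage string, a couple of quick checks look like this (the file path is a placeholder):

./parquet-reader --only-metadata /path/to/file.parquet
./parquet-reader --json /path/to/file.parquet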

If you have problems building, you may have to use vcpkg for the missing libraries.

Also see another solution that offers less, but is simpler: https://github.com/chhantyal/parquet-cli

I originally tried brew install parquet-tools, but this did not appear to work under my installation of WSL.

Windows 10 + MSVC

Same as above. Use CMake to generate the Visual Studio 2019 project, then build it.

git clone https://github.com/apache/arrow
cd arrow
cd cpp
mkdir buildmsvc
cd buildmsvc
cmake .. -DPARQUET_BUILD_EXECUTABLES=ON -DARROW_PARQUET=ON -DARROW_WITH_SNAPPY=ON -DARROW_WITH_BROTLI=ON -DPARQUET_BUILD_EXAMPLES=ON -DARROW_CSV=ON
# Then open the generated .sln file in MSVC and build. Everything should build perfectly.

Troubleshooting:

Whenever there were any missing libraries, I pointed the build at my vcpkg installation. I ran vcpkg integrate install, then added the following to the end of the CMake line:

-DCMAKE_TOOLCHAIN_FILE=[...path...]/vcpkg/scripts/buildsystems/vcpkg.cmake

If it complained about any missing libraries, I installed them (boost, etc.) with commands like vcpkg install boost:x64-windows.

On Windows 10 x64, try Parq:

choco install parq

This installs everything into the current directory. You will have to add this directory manually to your path, or run parq.exe from within this directory.

My other answer builds parquet-reader from source. This utility looks like it does much the same job.

Actually, I've found out that pandas already supports parquet files, as long as you have pyarrow or fastparquet installed as its backend. Check out read_parquet:

import pandas as pd

df = pd.read_parquet('your-file.parquet')

df.head(10)
...
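
Since the question is about the command line, the same thing also works as a one-liner, and to_json gives you the JSON-per-record output the OP asked for (a sketch; it assumes pandas plus pyarrow or fastparquet are installed and the file is available locally):

python -c "import pandas as pd; print(pd.read_parquet('your-file.parquet').head(10).to_json(orient='records', lines=True))"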

Previous answer: Might be late to the party, but I just learned that pyarrow already supports reading parquet, and it is quite powerful. Chances are you already have pyarrow and pandas installed, so you can read parquet like this:

from pyarrow import parquet
import pandas

p = parquet.read_table('/path/to/your/xxxxx.parquet')
df = p.to_pandas()

df.head(10)
...

If you use Docker you can also do something like this:

docker run -ti -v C:\file.parquet:/tmp/file.parquet nathanhowell/parquet-tools cat /tmp/file.parquet
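
If you want JSON instead of the plain dump, the --json option from the parquet-tools answer above should also work here, assuming the parquet-tools bundled in that image supports it:

docker run -ti -v C:\file.parquet:/tmp/file.parquet nathanhowell/parquet-tools cat --json /tmp/file.parquet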

In case anyone else comes here looking for an easy way to inspect parquet files from the command line, I wrote the tool clidb to do this.

It doesn't produce JSON like the OP wanted, but instead displays the parquet data as a table and allows SQL snippets to be run against it. It should work with:

pip install "clidb[extras]"
clidb path/with/data