I am trying to download a BigQuery dataset from the Google platform into my R workspace so that I can analyze it, using the following code:
library(bigrquery)
library(DBI)
library(tidyverse)
library(dplyr)
con = dbConnect(
  bigquery(),
  project = "bigquery-public-data",
  dataset = "new_york_citibike",
  billing = "maanan-bigquery-in-r"
)
bigrquery::bq_auth()
my_db_pointer = tbl(con, "citibike_trips")
glimpse(my_db_pointer)
count(my_db_pointer)
selected = select(my_db_pointer, everything()) %>% collect()
However, when I try to run the last line to download the data, it returns the following error:
Complete
Billed: 0 B
Downloading first chunk of data.
Received 55,308 rows in the first chunk.
Downloading the remaining 58,882,407 rows in 1420 chunks of (up to) 41,481 rows.
Downloading data [=====>--------------------------------------------------------------------------------------------------] 6% ETA: 19m
Error in `signal_reason()`:
! Exceeded rate limits: Your project:453562790213 exceeded quota for tabledata.list bytes per second per project. For more information, see https://cloud.google.com/bigquery/troubleshooting-errors [rateLimitExceeded]
ℹ Try increasing the `page_size` value of `bq_table_download()`
Run `rlang::last_error()` to see where the error occurred.
I would be very grateful if someone could help me fix this error and download the data. I need to analyze the dataset. Thanks in advance.
Posted on 2022-03-17 10:12:56
Based on the documentation link about rateLimitExceeded, it looks like you have hit a threshold for query jobs.
Please consider the following:
The BigQuery API has set limits and quotas that you can exceed when performing operations. To see your current quotas and limits, go to IAM & Admin > Quotas > project quotas for "projectid" > bigquery.googleapis.com. You are downloading 58,882,407 rows in chunks of about 55,308 rows, so it appears you are trying to download more data than the API allows, and you may be running into limits such as the query/script execution-time limit, the maximum response size, the maximum row size, or the operations-per-day quota. https://stackoverflow.com/questions/71494858
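As a minimal sketch of the two usual workarounds: the error message itself suggests raising `page_size` in `bq_table_download()` (a real bigrquery parameter), and an alternative is to push the computation into BigQuery with dplyr and `collect()` only the small result. The table and project names below are taken from the question; `my_db_pointer` is the lazy table the asker already created, and the specific `page_size` value is an illustrative guess, not a tuned setting.

```r
library(bigrquery)
library(dplyr)

# Option 1: download the raw table with a larger page size, as the error
# hint suggests, so fewer (but larger) tabledata.list requests are made
# per second and the bytes-per-second quota is less likely to trip.
tb <- bq_table("bigquery-public-data", "new_york_citibike", "citibike_trips")
trips <- bq_table_download(tb, page_size = 100000)

# Option 2 (usually better for ~59M rows): let BigQuery do the heavy
# lifting and collect() only the much smaller aggregated result.
station_counts <- my_db_pointer %>%
  group_by(start_station_name) %>%
  summarise(n_trips = n()) %>%
  collect()
```

Option 2 avoids the quota problem entirely when the analysis only needs summaries, because dplyr translates the pipeline to SQL that runs inside BigQuery.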