
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must likewise follow CC BY-SA and attribute it to the original authors (not me). Original: http://stackoverflow.com/questions/22157718/

Date: 2020-10-21 01:20:24  Source: igfitidea

Postgres group by timestamp into 6 hourly buckets

Tags: sql, postgresql

Asked by user636322

I have the following simple table:


ID  TIMESTAMP            VALUE
4   2011-05-27 15:50:04  1253
5   2011-05-27 15:55:02  1304
6   2011-05-27 16:00:02  1322
7   2011-05-27 16:05:01  1364

I would like to average the VALUEs, and GROUP each day's TIMESTAMPs into four 6-hourly buckets, e.g. 00:00 to 06:00, 06:00 to 12:00, 12:00 to 18:00 and 18:00 to 00:00.


I am able to group by year, month, day & hour using the following query:


select avg(VALUE),
  EXTRACT(year from TIMESTAMP) AS year,
  EXTRACT(month from TIMESTAMP) AS month,
  EXTRACT(day from TIMESTAMP) AS day
from TABLE
group by year, month, day

But I am unable to group each day into 4 periods as defined above, any help is most welcome.


Answered by user2989408

I think grouping on the integer part of (hour of your timestamp / 6) should help. Try it and see if it works for you. Your GROUP BY should be something like:


group by year, month, day, trunc(EXTRACT(hour from TIMESTAMP) / 6)

The logic behind this is that when the hour part of the date is divided by 6, the int values can only be


    0 - 00:00:00 to 05:59:59
    1 - 06:00:00 to 11:59:59
    2 - 12:00:00 to 17:59:59
    3 - 18:00:00 to 23:59:59

Grouping using this should put your data into 4 groups per day, which is what you need.

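Putting the pieces together, the complete query might look like the sketch below. The table name `readings` is a stand-in (the question only shows the placeholder `TABLE`), and `FLOOR` would work here just as well as `TRUNC` since the hour is never negative. PostgreSQL allows `GROUP BY` and `ORDER BY` to reference output-column aliases, which keeps the query readable:

```sql
-- Average VALUE per 6-hour bucket per day.
-- "readings" is a hypothetical name for the question's table.
SELECT AVG(VALUE) AS avg_value,
       EXTRACT(year  FROM TIMESTAMP) AS year,
       EXTRACT(month FROM TIMESTAMP) AS month,
       EXTRACT(day   FROM TIMESTAMP) AS day,
       TRUNC(EXTRACT(hour FROM TIMESTAMP) / 6) AS bucket  -- 0, 1, 2 or 3
FROM readings
GROUP BY year, month, day, bucket
ORDER BY year, month, day, bucket;
```

For the sample rows above, all four timestamps fall in bucket 2 (12:00 to 18:00) of 2011-05-27, so this query would return a single row averaging the four values.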