I have a fairly large dataset; for the purposes of this question it has 3 fields: Group ID, From Date, and To Date.
On any given row, From Date is always less than To Date, but within each group the periods described by the date pairs (which are in no particular order) can overlap, be contained within one another, or even be identical.
I would like to end up with a query that condenses each group's rows into contiguous periods. For example, a group that looks like this:
| Group ID | From Date  | To Date    |
--------------------------------------
| A        | 01/01/2012 | 12/31/2012 |
| A        | 12/01/2013 | 11/30/2014 |
| A        | 01/01/2015 | 12/31/2015 |
| A        | 01/01/2015 | 12/31/2015 |
| A        | 02/01/2015 | 03/31/2015 |
| A        | 01/01/2013 | 12/31/2013 |
would produce this result:
| Group ID | From Date  | To Date    |
--------------------------------------
| A        | 01/01/2012 | 11/30/2014 |
| A        | 01/01/2015 | 12/31/2015 |
I have read quite a bit about date packing, but I can't quite see how to apply it to my dataset.
How can I construct a query that gives me these results?
The solution from the book Microsoft® SQL Server® 2012 High-Performance T-SQL Using Window Functions:
;with C1 as(
-- each period becomes two events: +1 at FromDate, -1 on the day after ToDate
select GroupID, FromDate as ts, +1 as type, 1 as sub
from dbo.table_name
union all
select GroupID, dateadd(day, +1, ToDate) as ts, -1 as type, 0 as sub
from dbo.table_name),
C2 as(
-- running count of open periods; for a start event, sub = 1 excludes the event itself,
-- so cnt = 0 marks the start or the end of a packed interval
select C1.*
, sum(type) over(partition by GroupID order by ts, type desc
rows between unbounded preceding and current row) - sub as cnt
from C1),
C3 as(
-- keep only the boundary events; consecutive pairs (start, end) share the same grpnum
select GroupID, ts, floor((row_number() over(partition by GroupID order by ts) - 1) / 2 + 1) as grpnum
from C2
where cnt = 0)
-- undo the one-day shift applied to the end events in C1
select GroupID, min(ts) as FromDate, dateadd(day, -1, max(ts)) as ToDate
from C3
group by GroupID, grpnum;
Table creation script:
if object_id('table_name') is not null
drop table table_name
create table table_name(GroupID varchar(100), FromDate datetime,ToDate datetime)
insert into table_name
select 'A', '01/01/2012', '12/31/2012' union all
select 'A', '12/01/2013', '11/30/2014' union all
select 'A', '01/01/2015', '12/31/2015' union all
select 'A', '01/01/2015', '12/31/2015' union all
select 'A', '02/01/2015', '03/31/2015' union all
select 'A', '01/01/2013', '12/31/2013'
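To see why the cnt = 0 filter identifies the boundaries, here is a small diagnostic sketch (my own addition, run against the dbo.table_name created above, not part of the book) that materializes the intermediate event stream of C1/C2:

-- +1 events open a period at FromDate, -1 events close it on the day after ToDate;
-- cnt is the number of periods still open, so cnt = 0 marks a packed-interval boundary
;with C1 as(
select GroupID, FromDate as ts, +1 as type, 1 as sub from dbo.table_name
union all
select GroupID, dateadd(day, +1, ToDate) as ts, -1 as type, 0 as sub from dbo.table_name)
select GroupID, ts, type,
sum(type) over(partition by GroupID order by ts, type desc
rows between unbounded preceding and current row) - sub as cnt
from C1
order by GroupID, ts, type desc;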
I would use a Calendar table. Such a table simply lists dates, covering several decades.
CREATE TABLE [dbo].[Calendar](
    [dt] [date] NOT NULL,
    CONSTRAINT [PK_Calendar] PRIMARY KEY CLUSTERED ([dt] ASC)
);
There are many ways to populate such a table.
For example, 100K rows (roughly 270 years) starting from 1900-01-01:
INSERT INTO dbo.Calendar (dt)
SELECT TOP (100000)
DATEADD(day, ROW_NUMBER() OVER (ORDER BY s1.[object_id])-1, '19000101') AS dt
FROM sys.all_objects AS s1 CROSS JOIN sys.all_objects AS s2
OPTION (MAXDOP 1);
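On SQL Server 2022 and later, the same table could also be populated with GENERATE_SERIES instead of the sys.all_objects cross join (a sketch, not part of the original answer):

-- 100,000 consecutive dates starting at 1900-01-01
INSERT INTO dbo.Calendar (dt)
SELECT DATEADD(day, s.value, '19000101') AS dt
FROM GENERATE_SERIES(0, 99999) AS s;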
Once you have the Calendar table, here is how to use it.
Each original row is joined to the Calendar table to return as many rows as there are dates between From Date and To Date.
Possible duplicates are then removed.
Then the classic gaps-and-islands technique is applied by numbering the rows with two sequences.
Finally, the islands that were found are grouped together to produce the new From and To dates.
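The gaps-and-islands step relies on the fact that, for a run of consecutive dates, a day offset and a row number grow in lockstep, so their difference is constant within a run and jumps at every gap. A minimal self-contained sketch with made-up dates (not the sample data below):

-- IslandNumber is identical for 2015-01-01..2015-01-03 and changes after the gap
DECLARE @D TABLE (dt date);
INSERT INTO @D (dt) VALUES
('2015-01-01'), ('2015-01-02'), ('2015-01-03'),
('2015-01-10'), ('2015-01-11');
SELECT dt,
DATEDIFF(day, '2001-01-01', dt) - ROW_NUMBER() OVER (ORDER BY dt) AS IslandNumber
FROM @D;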
Sample data
I added a second group.
DECLARE @T TABLE (GroupID int, FromDate date, ToDate date);
INSERT INTO @T (GroupID, FromDate, ToDate) VALUES
(1, '2012-01-01', '2012-12-31'),
(1, '2013-12-01', '2014-11-30'),
(1, '2015-01-01', '2015-12-31'),
(1, '2015-01-01', '2015-12-31'),
(1, '2015-02-01', '2015-03-31'),
(1, '2013-01-01', '2013-12-31'),
(2, '2012-01-01', '2012-12-31'),
(2, '2013-01-01', '2013-12-31');
Query
WITH
CTE_AllDates
AS
(
SELECT DISTINCT
T.GroupID
,CA.dt
FROM
@T AS T
CROSS APPLY
(
SELECT dbo.Calendar.dt
FROM dbo.Calendar
WHERE
dbo.Calendar.dt >= T.FromDate
AND dbo.Calendar.dt <= T.ToDate
) AS CA
)
,CTE_Sequences
AS
(
SELECT
GroupID
,dt
,ROW_NUMBER() OVER(PARTITION BY GroupID ORDER BY dt) AS Seq1
,DATEDIFF(day, '2001-01-01', dt) AS Seq2
,DATEDIFF(day, '2001-01-01', dt) -
ROW_NUMBER() OVER(PARTITION BY GroupID ORDER BY dt) AS IslandNumber
FROM CTE_AllDates
)
SELECT
GroupID
,MIN(dt) AS NewFromDate
,MAX(dt) AS NewToDate
FROM CTE_Sequences
GROUP BY GroupID, IslandNumber
ORDER BY GroupID, NewFromDate;
Result
+---------+-------------+------------+
| GroupID | NewFromDate | NewToDate |
+---------+-------------+------------+
| 1 | 2012-01-01 | 2014-11-30 |
| 1 | 2015-01-01 | 2015-12-31 |
| 2 | 2012-01-01 | 2013-12-31 |
+---------+-------------+------------+
Another option is a recursive CTE that walks each group's rows in From Date order, extending the current running period when the next row overlaps or touches it and starting a new period otherwise:

; with
cte as
(
-- number each group's rows by From Date
select *, rn = row_number() over (partition by [Group ID] order by [From Date])
from tbl
),
rcte as
(
-- anchor: the first row of each group opens running period no. 1
select rn, [Group ID], [From Date], [To Date], GrpNo = 1, GrpFrom = [From Date], GrpTo = [To Date]
from cte
where rn = 1
union all
-- recursive step: compare each row with the accumulated period of the previous row
select c.rn, c.[Group ID], c.[From Date], c.[To Date],
GrpNo = case when c.[From Date] between r.GrpFrom and dateadd(day, 1, r.GrpTo)
or c.[To Date] between r.GrpFrom and r.GrpTo
then r.GrpNo
else r.GrpNo + 1
end,
GrpFrom= case when c.[From Date] between r.GrpFrom and dateadd(day, 1, r.GrpTo)
or c.[To Date] between r.GrpFrom and r.GrpTo
then case when c.[From Date] > r.GrpFrom then c.[From Date] else r.GrpFrom end
else c.[From Date]
end,
GrpTo = case when c.[From Date] between r.GrpFrom and dateadd(day, 1, r.GrpTo)
or c.[To Date] between r.GrpFrom and dateadd(day, 1, r.GrpTo)
then case when c.[To Date] > r.GrpTo then c.[To Date] else r.GrpTo end
else c.[To Date]
end
from rcte r
inner join cte c on r.[Group ID] = c.[Group ID]
and r.rn = c.rn - 1
)
select [Group ID], min(GrpFrom), max(GrpTo)
from rcte
group by [Group ID], GrpNo
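Note that the recursion depth equals the number of rows in the largest group, and SQL Server's default recursion limit is 100 levels. For larger groups, the final SELECT above needs a recursion hint; a minimal sketch of that SELECT with the limit lifted:

-- 0 = no recursion limit; the default of 100 levels would fail for groups with more than ~100 rows
select [Group ID], min(GrpFrom), max(GrpTo)
from rcte
group by [Group ID], GrpNo
option (maxrecursion 0);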
Here and elsewhere I've noticed that date-packing questions never get a geometric solution. After all, any range, including a date range, can be interpreted as a line, so why not convert the ranges to the sql geometry type and let geometry::UnionAggregate merge them? So I took a stab at it with your post.
In "numbers": a sequence of row numbers that is later used to pull the individual lines back out of the merged geometry.
In "mergeLines": each period becomes a horizontal line (the dates cast to floats, with 1 added to toDate so that adjacent periods touch), and the lines of a group are merged with geometry::UnionAggregate.
In the outer query: each individual line is extracted from the merged geometry and its endpoints are converted back to dates (subtracting the 1 that was added).
with
numbers as (
select row_number() over (order by (select null)) i
from @spans -- Where I put your data
),
mergeLines as (
select groupId,
lines = geometry::UnionAggregate(line)
from @spans
cross apply (select
startP = geometry::Point(convert(float,fromDate), 0, 0),
stopP = geometry::Point(convert(float,toDate) + 1, 0, 0)
) pointify
cross apply (select line = startP.STUnion(stopP).STEnvelope()) lineify
group by groupId
)
select groupId, fromDate, toDate
from mergeLines ml
join numbers n on n.i between 1 and ml.lines.STNumGeometries()
cross apply (select line = ml.lines.STGeometryN(i).STEnvelope()) l
cross apply (select
fromDate = convert(datetime, l.line.STPointN(1).STX),
toDate = convert(datetime, l.line.STPointN(3).STX) - 1
) unprepare
order by groupId, fromDate;
I ended up with another geometry-based approach, as follows:
WITH
T_GEO AS
(
SELECT GroupID,
geometry::STLineFromText(
CONCAT('LINESTRING ('
, DATEDIFF(second, '20000101', FromDate),
' 0,',
DATEDIFF(second, '20000101', ToDate),
' 0'
,')'), 0) AS GEO
FROM THE_TABLE
),
T_GEOAGG AS
(
SELECT GroupID,
geometry::UnionAggregate(GEO) AS UGEO
FROM T_GEO
GROUP BY GroupID
)
SELECT GroupID,
CAST(DATEADD(second, UGEO.STGeometryN(value).STPointN(1).STX, '20000101') AS DATETIME2(0)) AS FromDate,
CAST(DATEADD(second, UGEO.STGeometryN(value).STPointN(2).STX, '20000101') AS DATETIME2(0)) AS ToDate
FROM T_GEOAGG
CROSS APPLY generate_series(1, UGEO.STNumGeometries());
Tested with 1,000 intervals, it performs better than @pwilcox's version...
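Note that generate_series is available only from SQL Server 2022 onward. On earlier versions, the final SELECT could enumerate the merged segments with a correlated row-number source instead; a sketch, assuming the same T_GEO / T_GEOAGG CTEs as above:

-- pre-2022 replacement for CROSS APPLY generate_series(1, UGEO.STNumGeometries())
SELECT GroupID,
       CAST(DATEADD(second, UGEO.STGeometryN(N.n).STPointN(1).STX, '20000101') AS DATETIME2(0)) AS FromDate,
       CAST(DATEADD(second, UGEO.STGeometryN(N.n).STPointN(2).STX, '20000101') AS DATETIME2(0)) AS ToDate
FROM T_GEOAGG
CROSS APPLY (SELECT cnt = UGEO.STNumGeometries()) AS C
CROSS APPLY (SELECT TOP (C.cnt)
                    ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
             FROM sys.all_objects) AS N;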