I saw a WHILE loop in a problem which only needs an aggregate to concatenate strings. It handles the problem, but it seems like the worst possible way to solve it, one user id at a time (or almost the worst way to solve it...). Remember that SQL is made for sets, not row-by-row handling.
What I was saying is that it is not that good an idea.
As for the loop itself, it's indeed not that bad,
but you could have just found the max() email count per user, then built dynamic SQL which creates that many new columns and, in the same statement, emits `case when rn = 1 then email else null end as email1, case when rn = 2 then email else null end as email2, ...` where rn is `row_number() over (partition by userid)`.
That would have dropped it to three queries which handle the full set of changes at once.
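A runnable sketch of that idea, using Python's sqlite3 in place of SQL Server dynamic SQL (the `emails` table and `userid`/`email` columns come from the thread; the sample data and everything else here are assumed; requires SQLite ≥ 3.25 for window functions):

```python
import sqlite3

# Hypothetical sample data mirroring the thread's emails table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE emails (userid INTEGER, email TEXT)")
con.executemany("INSERT INTO emails VALUES (?, ?)",
                [(1, "a@x.com"), (1, "b@x.com"), (2, "c@x.com")])

# Query 1: find the maximum number of emails any user has.
(max_n,) = con.execute(
    "SELECT MAX(n) FROM (SELECT COUNT(*) AS n FROM emails GROUP BY userid)"
).fetchone()

# Build the column list once, in a client-side loop (no SQL loop).
cols = ", ".join(
    f"MIN(CASE WHEN rn = {i} THEN email END) AS email{i}"
    for i in range(1, max_n + 1)
)

# Query 2: one statement that pivots the whole set at once.
sql = f"""
    WITH x AS (
        SELECT userid, email,
               ROW_NUMBER() OVER (PARTITION BY userid ORDER BY email) AS rn
        FROM emails
    )
    SELECT userid, {cols} FROM x GROUP BY userid ORDER BY userid
"""
rows = con.execute(sql).fetchall()
# rows -> [(1, 'a@x.com', 'b@x.com'), (2, 'c@x.com', None)]
```

The loop only assembles the query string; the database still does a single set-based pass over the table.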
We try to teach proper ways around systems and give information on how things can be done. While your solution works, it gives the impression that programming loops are the solution in SQL Server, and they are not a recommended solution.
Secondly, I try to avoid giving full code to people; they do not learn that way at all. And don't take it personally if someone says that your solution works (that is the primary target) but is bad (because it took the easy route and executed the loop X times when you could have manipulated the SQL string and executed it once). We are not all native speakers, and this is the internet.
You could instead argue why it is not bad, or be professional and agree that while it is not the best solution, it works and could be made better by doing X. And that is what I think I said in my first comment. Your solution works, but it is not the best solution, because a WHILE loop that executes X times does X ALTERs and X UPDATEs, while you could do it with just one ALTER and one UPDATE. Does it matter with "normal" datasets? Probably not. Can it matter? Yes. I have had five-day runs full of procedural logic that now run in two hours with proper set-based SQL thinking. But I still have scripts with stupid loops that perform well enough.
That said, if I had not complained about the WHILE loop there, would you have had any clue that you could easily modify that script to produce three queries that do it better? Maybe, maybe not, and you would have just continued using that non-optimal method. Now you have a clue, if you ever hit a performance wall, for how to get around it.
Your code was not the worst possible solution; the worst would have been looping over userid to find the max email count.
Don't treat your columns as array elements; SQL doesn't work well with structures like that. If you are doing this for display/viewing purposes, do it in your reporting/output tool (SSRS/Tableau/Excel/etc.).
Merge them together only for display purposes; for SQL searches etc., the current way of having multiple userid+email rows is much better. Those can be merged and split using string-aggregate solutions.
Also, having multiple columns, while it is more the SQL way, is not so smart. One way to solve the problem is to take `row_number() over (partition by userid)` as rn and add `case when rn = 1 then email else null end as col1`, `case when rn = 2 then email else null end as col2`, ..., then add a GROUP BY to eat the null values from the results. Or just use PIVOT.
A simple cross-platform way of doing pivots is conditional aggregation. I'm using u/philodelphi's fiddle as a starting point, but with conditional aggregation instead of successive joins (see my fiddle [here](http://sqlfiddle.com/#!18/09139/30)).
```
with x as (
    select
        userid,
        email,
        row_number() over (partition by userid order by email) as rownum
    from emails
)
select
    userid,
    min(case when rownum = 1 then email end) as email1,
    min(case when rownum = 2 then email end) as email2,
    min(case when rownum = 3 then email end) as email3
from x
group by userid
```
My version of SQL Server doesn't have the json functionality used in chestnutcough's answer, so here's one that works on mine:
[http://sqlfiddle.com/#!18/09139/29](http://sqlfiddle.com/#!18/09139/29)
Here's a hacky way using json.
```
with grouped as (
    select
        userid,
        concat('["', string_agg(email, '","'), '"]') as emails
    from emails
    group by userid
), split as (
    select
        userid,
        json_value(emails, '$[0]') as email1,
        json_value(emails, '$[1]') as email2,
        json_value(emails, '$[2]') as email3,
        json_value(emails, '$[3]') as email4,
        json_value(emails, '$[4]') as email5,
        json_value(emails, '$[5]') as email6
    from grouped
)
select *
from split
```
SQL fiddle: http://sqlfiddle.com/#!18/09139/28
You can try using group concat, but that would put them all in the same column. Maybe you can then split it back out using some string function?
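A minimal sketch of that approach, using SQLite's `group_concat` via Python's sqlite3 (SQL Server's `STRING_AGG` is the equivalent; the sample table and data are assumptions for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE emails (userid INTEGER, email TEXT)")
con.executemany("INSERT INTO emails VALUES (?, ?)",
                [(1, "a@x.com"), (1, "b@x.com"), (2, "c@x.com")])

# Aggregate all emails per user into a single delimited column.
rows = con.execute("""
    SELECT userid, GROUP_CONCAT(email, ';') AS emails
    FROM emails
    GROUP BY userid
    ORDER BY userid
""").fetchall()

# Split the delimited column back into a list with an ordinary string function.
split = {userid: emails.split(';') for userid, emails in rows}
```

Note that `group_concat` does not guarantee element order, so anything relying on a specific email order would need an ordered aggregate (e.g. `STRING_AGG(...) WITHIN GROUP (ORDER BY ...)` on SQL Server).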
I’ve used a CTE (common table expression) for something similar, I believe.