
2023 Summer Internship Interview Experience: Meituan - Business Analyst



Timeline: first-round interview on 2022-04-20 (rejected)

My background at the time: bachelor's and master's from BUPT, a management-related major, and two internships: data analyst at ByteDance and business analyst at Meituan.

A note up front: this post records questions from 2023 summer internship interviews, so the dates all fall in 2022.

First-Round Interview

1. SQL coding

During the interview the screen is shared and the SQL is written locally; it does not have to run, a rough approach is enough.

The solutions below may contain mistakes; corrections from readers are welcome.

There are two tables:

Orders table orders, fields: order_id 'order id', user_id 'user id', bikeid 'bike id', dt 'order date, e.g. 20220101', starttime 'order start time, e.g. 2022-01-01 12:00:00', endtime 'order end time'

Users table users, fields: user_id 'user id', account_name 'user login name'
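
For reference, a minimal sketch of the two tables; the column types are assumptions in a MySQL-style dialect, since the interview only gave field names and comments:

create table orders (
    order_id  bigint      comment 'order id',
    user_id   bigint      comment 'user id',
    bikeid    bigint      comment 'bike id',
    dt        varchar(8)  comment 'order date, e.g. 20220101',
    starttime datetime    comment 'order start time, e.g. 2022-01-01 12:00:00',
    endtime   datetime    comment 'order end time'
);

create table users (
    user_id      bigint      comment 'user id',
    account_name varchar(64) comment 'user login name'
);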

The questions:

1) Find the login name of the user with the most orders in January 2022


-- count January 2022 orders per user and keep the top user
select account_name, count(order_id) as order_cnt
from users join orders using(user_id)
where dt like '202201%'
group by user_id, account_name
order by order_cnt desc
limit 1
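
If several users tie for the highest order count, limit 1 keeps only one of them; a window-function variant (a sketch assuming a dialect with window functions, e.g. MySQL 8+) would return all tied users:

with cnt as
(
select account_name, count(order_id) as order_cnt
from users join orders using(user_id)
where dt like '202201%'
group by user_id, account_name
),
ranked as
(
select account_name, order_cnt,
dense_rank() over(order by order_cnt desc) as rk
from cnt
)
select account_name, order_cnt
from ranked
where rk = 1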

2) A user may place several orders in one day, and too many orders in a single day can be a sign of cheating. For January 2022, find each user's highest single-day order count, together with the date on which that maximum occurred.


with a as
(
-- orders per user per day in January 2022
select user_id, dt, count(order_id) as cnt
from orders
where dt like '202201%'
group by user_id, dt
),
b as
(
select user_id, dt, cnt,
rank() over(partition by user_id order by cnt desc) as rn
from a
)
select user_id, dt, cnt
from b
where rn = 1
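
A note on the ranking choice: rank() keeps ties, so a user whose maximum daily count occurs on several dates gets one row per such date; if exactly one row per user is wanted, row_number() can be substituted, and one of the tied dates is then kept arbitrarily.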

3) Compute the distribution of bike turnover for January 2022 and output it in the two formats below:

Note: turnover here means the number of times a bike was ridden within the statistics window

Format 1:


-- ride count per bike in January 2022, then number of bikes at each ride count
select tmp.ride_cnt as transfer_rate, count(tmp.bikeid) as bike_cnt
from
(select bikeid, count(order_id) as ride_cnt
from orders
where dt like '202201%'
group by bikeid) as tmp
group by tmp.ride_cnt

Format 2:


with a as
(
-- ride count per bike in January 2022
select bikeid, count(order_id) as ride_cnt
from orders
where dt like '202201%'
group by bikeid
),
b as
(
-- map each bike to a ride-count bucket
select bikeid,
case when ride_cnt between 1 and 5 then '1-5'
when ride_cnt between 6 and 10 then '6-10' end as transfer_rate
from a
)
select transfer_rate, count(bikeid) as bike_cnt
from b
group by transfer_rate

4) For January 2022, compute how long on average it takes, after a bike has been ridden, before its next ride starts


with a as
(
-- pair each ride with the start time of the same bike's next ride
select bikeid, starttime, endtime,
lead(starttime) over(partition by bikeid order by starttime) as next_start
from orders
where dt like '202201%'
),
b as
(
-- idle gap between the end of one ride and the start of the next, in hours
select bikeid, timestampdiff(hour, endtime, next_start) as time_on_hold
from a
where next_start is not null
)
select bikeid, avg(time_on_hold) as avg_time_on_hold
from b
group by bikeid
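
The wording could also be read as asking for one fleet-wide average rather than a per-bike figure; under that reading, a minimal variant of the same lead()-based approach would be:

with a as
(
select bikeid, endtime,
lead(starttime) over(partition by bikeid order by starttime) as next_start
from orders
where dt like '202201%'
)
select avg(timestampdiff(hour, endtime, next_start)) as avg_time_on_hold
from a
where next_start is not null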

#Meituan# #summer internship# #data analysis#