[ Upstream commit 0aeccc63f3bc4cfd49dc4893da1409402ee6b295 ]

The driver uses an upper-bound lookup to decide the target JPEG encode
quality, but there is a logic bug: if the desired quality is higher
than anything the driver supports, the driver falls back to the worst
quality instead of the best one.

Fix the bug by defaulting to the best supported quality before the
lookup, with a trivial refactor to avoid long lines.

Fixes: 45f13a57d813 ("media: platform: Add jpeg enc feature")
Signed-off-by: Fei Shao <fshao@chromium.org>
Reviewed-by: Chen-Yu Tsai <wenst@chromium.org>
Signed-off-by: Hans Verkuil <hverkuil-cisco@xs4all.nl>
Signed-off-by: Sasha Levin <sashal@kernel.org>
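
For illustration, here is a minimal standalone sketch of the corrected
upper-bound lookup, assuming a quality table sorted in ascending order
like the driver's mtk_jpeg_enc_quality[]. The table contents, names and
values below are hypothetical and only stand in for the driver's real
entries:

	#include <stdio.h>

	struct quality_entry {
		unsigned int quality_param;   /* requested-quality upper bound */
		unsigned int hardware_value;  /* value programmed into the HW */
	};

	/* Hypothetical table, ascending by quality_param. */
	static const struct quality_entry quality_table[] = {
		{ 48, 0x0 },
		{ 60, 0x1 },
		{ 80, 0x2 },
		{ 97, 0x3 },
	};

	#define ARRAY_SIZE(a) (sizeof(a) / sizeof((a)[0]))

	static unsigned int pick_hw_quality(unsigned int requested)
	{
		unsigned int n = ARRAY_SIZE(quality_table);
		/*
		 * Default to the best (last) entry, so a request above the
		 * highest supported quality no longer falls back to the worst.
		 */
		unsigned int hw = quality_table[n - 1].hardware_value;
		unsigned int i;

		for (i = 0; i < n; i++) {
			if (requested <= quality_table[i].quality_param) {
				hw = quality_table[i].hardware_value;
				break;
			}
		}
		return hw;
	}

	int main(void)
	{
		/*
		 * 100 exceeds every table entry: with the old default this
		 * would have selected the worst quality; now it selects the
		 * best. 50 still matches the first entry that is >= 50.
		 */
		printf("hw value for quality 100: %#x\n", pick_hw_quality(100));
		printf("hw value for quality 50:  %#x\n", pick_hw_quality(50));
		return 0;
	}

The actual diff applies the same idea inside mtk_jpeg_set_enc_params():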
 	u32 img_stride;
 	u32 mem_stride;
 	u32 i, enc_quality;
+	u32 nr_enc_quality = ARRAY_SIZE(mtk_jpeg_enc_quality);
 
 	value = width << 16 | height;
 	writel(value, base + JPEG_ENC_IMG_SIZE);
 	writel(img_stride, base + JPEG_ENC_IMG_STRIDE);
 	writel(mem_stride, base + JPEG_ENC_STRIDE);
 
-	enc_quality = mtk_jpeg_enc_quality[0].hardware_value;
-	for (i = 0; i < ARRAY_SIZE(mtk_jpeg_enc_quality); i++) {
+	enc_quality = mtk_jpeg_enc_quality[nr_enc_quality - 1].hardware_value;
+	for (i = 0; i < nr_enc_quality; i++) {
 		if (ctx->enc_quality <= mtk_jpeg_enc_quality[i].quality_param) {
 			enc_quality = mtk_jpeg_enc_quality[i].hardware_value;
 			break;