decode: validate shader type

Fix found thanks to american fuzzy lop.

Signed-off-by: Marc-André Lureau <marcandre.lureau@redhat.com>
Authored by Marc-André Lureau 9 years ago, committed by Dave Airlie
parent ad4f0f1941
commit 36492a4012
1 changed file with 4 additions and 0 deletions:
    src/vrend_decode.c

@@ -219,6 +219,10 @@ static int vrend_decode_set_constant_buffer(struct vrend_decode_ctx *ctx, uint16
 
    shader = get_buf_entry(ctx, VIRGL_SET_CONSTANT_BUFFER_SHADER_TYPE);
    index = get_buf_entry(ctx, VIRGL_SET_CONSTANT_BUFFER_INDEX);
+
+   if (shader >= PIPE_SHADER_TYPES)
+      return EINVAL;
+
    vrend_set_constants(ctx->grctx, shader, index, nc, get_buf_ptr(ctx, VIRGL_SET_CONSTANT_BUFFER_DATA_START));
    return 0;
 }
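
For context (not part of the commit): below is a minimal, standalone C sketch of the same bounds-check pattern, assuming the decoded shader value is later used to index fixed-size per-shader-type state. It is not virglrenderer code; the array name, its size, and set_constant() are hypothetical stand-ins chosen only to illustrate why rejecting an out-of-range, guest-controlled index with EINVAL prevents an out-of-bounds write.

/* Hypothetical sketch (not virglrenderer code): why the bounds check matters
 * when an index arrives from an untrusted command stream. */
#include <errno.h>
#include <stdint.h>
#include <stdio.h>

#define SHADER_TYPES 6                    /* stand-in for PIPE_SHADER_TYPES */

static uint32_t constants[SHADER_TYPES];  /* hypothetical per-shader state */

/* Reject out-of-range indices before touching per-shader state. */
static int set_constant(uint32_t shader, uint32_t value)
{
   if (shader >= SHADER_TYPES)
      return EINVAL;                      /* same pattern as the patch above */
   constants[shader] = value;
   return 0;
}

int main(void)
{
   printf("%d\n", set_constant(2, 42));   /* 0: accepted */
   printf("%d\n", set_constant(99, 42));  /* EINVAL: would have been an OOB write */
   return 0;
}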
