Topic 1 Question 303
You work for a media company that operates a streaming movie platform where users can search for movies in a database. The existing search algorithm uses keyword matching to return results. Recently, you have observed an increase in searches using complex semantic queries that include the movies’ metadata such as the actor, genre, and director.
You need to build a revamped search solution that will provide better results, and you need to build this proof of concept as quickly as possible. How should you build the search platform?
A. Use a foundational large language model (LLM) from Model Garden as the search platform’s backend.
B. Configure Vertex AI Vector Search as the search platform’s backend.
C. Use a BERT-based model and host it on a Vertex AI endpoint.
D. Create the search platform through Vertex AI Agent Builder.
User votes
Comments (2)
- Selected Answer: B
B. Configure Vertex AI Vector Search as the search platform’s backend. Why Option B?
- Best for semantic search and metadata queries: keyword-based search is insufficient for complex semantic queries (e.g., "Find action movies starring Tom Cruise directed by Christopher Nolan"). Vertex AI Vector Search supports vector embeddings, which enable semantic similarity search instead of exact keyword matching.
- Fast proof of concept with minimal effort: it is a pre-built, highly scalable solution for semantic search. There is no need to train a model manually; you simply generate embeddings from the movie metadata (actors, genre, director, etc.) and store them in Vertex AI Vector Search, as sketched below.
- Scalable, high-performance search engine: it is optimized for low-latency retrieval and returns the most relevant results quickly. It handles multi-dimensional queries well, making it a good fit for metadata-rich movie search.
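As a rough illustration of the query side of this approach, here is a minimal sketch, assuming a Vector Search index has already been built from movie-metadata embeddings and deployed. The project, region, endpoint, index ID, and embedding model name are placeholders, not values given in the question.

```python
# Minimal sketch: semantic movie search against a Vertex AI Vector Search index
# built from movie-metadata embeddings. All identifiers below are placeholders.
import vertexai
from vertexai.language_models import TextEmbeddingModel
from google.cloud import aiplatform

PROJECT_ID = "your-project-id"                                              # placeholder
REGION = "us-central1"                                                      # placeholder
ENDPOINT_NAME = "projects/123/locations/us-central1/indexEndpoints/456"     # placeholder
DEPLOYED_INDEX_ID = "movies_deployed_index"                                 # placeholder

vertexai.init(project=PROJECT_ID, location=REGION)
aiplatform.init(project=PROJECT_ID, location=REGION)

# 1) Embed the user's natural-language query with the same embedding model
#    used to embed each movie's metadata (title, actors, genre, director).
embedding_model = TextEmbeddingModel.from_pretrained("text-embedding-004")
query = "action movies starring Tom Cruise directed by Christopher Nolan"
query_vector = embedding_model.get_embeddings([query])[0].values

# 2) Retrieve the nearest movie vectors from the deployed Vector Search index.
index_endpoint = aiplatform.MatchingEngineIndexEndpoint(
    index_endpoint_name=ENDPOINT_NAME
)
response = index_endpoint.find_neighbors(
    deployed_index_id=DEPLOYED_INDEX_ID,
    queries=[query_vector],
    num_neighbors=10,
)

# Each match exposes the datapoint ID (e.g., a movie ID) and a distance score;
# the IDs can then be looked up in the movie database to render results.
for match in response[0]:
    print(match.id, match.distance)
```

The catalog side follows the same pattern: embed each movie's metadata string with the same embedding model and upsert the vectors into the index, so no custom model training is required for the proof of concept.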
👍 1  tk786786  2025/02/19

- Selected Answer: B
Answer B. Vector search is more efficient for search-style queries.
- A: Makes sense and is easily deployable, but this is a search use case; LLMs are typically suited to more conversational applications and may not prioritize speed.
- C: BERT adds unnecessary complexity and training effort.
- D: Would work, but agents are geared more toward conversation, and results have higher latency compared to vector search.
👍 1  5091a99  2025/03/04