Repositorio Institucional
CONICET Digital
Article

Integrating Bayesian and neural networks models for eye movement prediction in hybrid search

Ruarte, Gonzalo; Bujía, Gastón Elián; Care, Damian Ariel; Ison, Matias Julian; Kamienkowski, Juan Esteban
Publication date: 05/2025
Publisher: Nature
Journal: Scientific Reports
ISSN: 2045-2322
Language: English
Resource type: Published article
Subject classification:
Other Computer and Information Sciences

Abstract

Visual search is crucial in daily human interaction with the environment. Hybrid search extends this by requiring observers to find any item from a given set. Recently, a few models were proposed to simulate human eye movements in visual search tasks within natural scenes, but none were implemented for hybrid search under similar conditions. We present an enhanced neural network Entropy Limit Minimization (nnELM) model, grounded in a Bayesian framework and signal detection theory, and the Hybrid Search Eye Movements (HSEM) Dataset, containing thousands of human eye movements during hybrid search tasks. A key challenge in hybrid search is that participants have to look for several different objects at the same time. To address this, we developed several strategies involving the posterior probability distributions after each fixation. Adjusting peripheral visibility improved early-stage efficiency, aligning it with human behavior. Limiting the model's memory reduced success in longer searches, mirroring human performance. We validated these improvements by comparing our model against a held-out set within the HSEM and against other models in a separate visual search benchmark. Overall, the new nnELM model not only handles hybrid search in natural scenes but also closely replicates human behavior, advancing our understanding of search processes while maintaining interpretability.
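The abstract's core mechanism, a posterior probability distribution over the target's location that is updated after each fixation, with detectability (d') falling off in the periphery, can be illustrated with a rough sketch. Everything below is an assumption made for illustration: the grid, the Gaussian evidence model, the exponential visibility fall-off, and the ELM-style fixation rule are generic textbook choices, not the paper's implementation. For hybrid search, one could maintain one log-posterior per candidate target and combine them, e.g. with `np.logaddexp`.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

def visibility(grid_shape, fix, d0=3.0, decay=0.15):
    """Detectability d' at each cell, decaying with eccentricity from fixation."""
    ys, xs = np.indices(grid_shape)
    ecc = np.hypot(ys - fix[0], xs - fix[1])
    return d0 * np.exp(-decay * ecc)

def update_posterior(log_post, target, fix, grid_shape, d0=3.0):
    """One fixation's Bayesian update of the log-posterior over target location.

    Signal-detection evidence per cell: W ~ N(+d'^2/2, d'^2) at the target and
    N(-d'^2/2, d'^2) elsewhere, so the log-likelihood ratio equals W itself.
    """
    d = visibility(grid_shape, fix, d0)
    truth = np.full(grid_shape, -0.5)
    truth[target] = 0.5
    w = d**2 * truth + d * rng.standard_normal(grid_shape)
    log_post = log_post + w
    return log_post - log_post.max()          # rescale for numerical stability

def elm_next_fixation(post, grid_shape, d0=3.0):
    """ELM-style rule: fixate where expected information gain is largest
    (approximated here as posterior mass weighted by squared visibility)."""
    gain = np.empty(grid_shape)
    for f in np.ndindex(grid_shape):
        gain[f] = np.sum(post * visibility(grid_shape, f, d0) ** 2)
    return np.unravel_index(np.argmax(gain), grid_shape)

# Demo: a short search on an 8x8 grid with the (hidden) target at (2, 5)
grid, target = (8, 8), (2, 5)
log_post, fix = np.zeros(grid), (4, 4)        # uniform prior, central start
for _ in range(8):
    log_post = update_posterior(log_post, target, fix, grid)
    post = np.exp(log_post)
    post /= post.sum()
    fix = elm_next_fixation(post, grid)
```

Under this evidence model the posterior sharpens around the true target over successive fixations; the memory limitation studied in the paper would correspond here to down-weighting or discarding evidence from older fixations.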
Keywords: eye movements, deep neural networks, Bayesian models, computational models, hybrid search, visual search
Associated files
Size: 2.706 MB
Format: PDF
License
info:eu-repo/semantics/openAccess. Except where explicitly stated otherwise, this item is published under the following license: Attribution-NonCommercial-NoDerivs 2.5 Argentina (CC BY-NC-ND 2.5 AR)
Identifiers
URI: http://hdl.handle.net/11336/274513
URL: https://www.nature.com/articles/s41598-025-00272-3
DOI: http://dx.doi.org/10.1038/s41598-025-00272-3
Collections
Artículos (ICC)
Articulos de INSTITUTO DE INVESTIGACION EN CIENCIAS DE LA COMPUTACION
Citation
Ruarte, Gonzalo; Bujía, Gastón Elián; Care, Damian Ariel; Ison, Matias Julian; Kamienkowski, Juan Esteban; Integrating Bayesian and neural networks models for eye movement prediction in hybrid search; Nature; Scientific Reports; 15; 1; 5-2025; 1-15