Institutional Repository
CONICET Digital
Event

Alternating Local Enumeration (TnALE): solving tensor network structure search with fewer evaluations

Li, Chao; Zeng, Junhua; Li, Chunmei; Caiafa, César Federico; Zhao, Qibin
Collaborators: Lawrence, Neil; Krause, Andreas
Event type: Conference
Event name: 40th International Conference on Machine Learning
Event date: 23/07/2023
Organizing institution: International Conference on Machine Learning (ICML)
Journal title: Proceedings of Machine Learning Research
Publisher: MLR Press
ISSN: 2640-3498
Language: English
Subject classification:
Other Computer and Information Sciences

Abstract

Tensor network (TN) is a powerful framework in machine learning, but selecting a good TN model, known as TN structure search (TN-SS), is a challenging and computationally intensive task. The recent approach TNLS (Li et al., 2022) showed promising results for this task, but it remains computationally expensive, requiring too many evaluations of the objective function. We propose TnALE, a surprisingly simple algorithm that updates each structure-related variable alternately by local enumeration, greatly reducing the number of evaluations compared to TNLS. We theoretically investigate the descent steps for TNLS and TnALE, proving that both algorithms can achieve linear convergence up to a constant if a sufficient reduction of the objective is reached in each neighborhood. We further compare the evaluation efficiency of TNLS and TnALE, revealing that Ω(2^K) evaluations are typically required in TNLS for reaching the objective reduction, while ideally O(KR) evaluations are sufficient in TnALE, where K denotes the dimension of the search space and R reflects the "low-rankness" of the neighborhood. Experimental results verify that TnALE can find practically good TN structures with vastly fewer evaluations than the state-of-the-art algorithms.
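The alternating local enumeration idea described in the abstract can be illustrated generically: each of the K structure-related variables is updated in turn by enumerating candidates in a small local neighborhood and keeping the best, so each sweep costs on the order of K times the neighborhood size rather than an exhaustive 2^K search. The sketch below is a minimal illustration of this coordinate-wise scheme, not the authors' implementation; the names `tnale_sketch` and `neighborhood` are hypothetical, and the toy objective stands in for a real TN-SS loss.

```python
# Hypothetical sketch of alternating local enumeration (not the authors' code):
# minimize a discrete objective by updating one variable at a time,
# enumerating a small local neighborhood and keeping the best candidate.

def tnale_sketch(objective, x0, neighborhood, max_sweeps=10):
    """Alternately update each coordinate of x by local enumeration.

    objective:    function mapping a candidate vector to a cost
    x0:           initial discrete vector (e.g., TN ranks)
    neighborhood: function mapping one coordinate value to candidate values
    Returns the best vector found, its cost, and the evaluation count.
    """
    x = list(x0)
    best = objective(x)
    evals = 1
    for _ in range(max_sweeps):
        improved = False
        for k in range(len(x)):               # alternate over the K variables
            for cand in neighborhood(x[k]):   # enumerate the local candidates
                trial = x[:k] + [cand] + x[k + 1:]
                val = objective(trial)
                evals += 1
                if val < best:                # keep the best candidate seen
                    best, x = val, trial
                    improved = True
        if not improved:                      # a full sweep with no gain: stop
            break
    return x, best, evals

# Toy objective: separable quadratic over positive integer "ranks".
obj = lambda x: sum((xi - 3) ** 2 for xi in x)
x, f, n = tnale_sketch(obj, [1, 1, 1], lambda v: [max(1, v - 1), v, v + 1])
# converges to x == [3, 3, 3] with objective 0
```

Each sweep evaluates at most K·|neighborhood| candidates, which mirrors the O(KR)-style evaluation count discussed in the abstract, in contrast to enumerating all combinations jointly.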
Keywords: Tensor Network, Signal Processing, Machine Learning
Associated files
Size: 2.605 MB
Format: PDF
License
info:eu-repo/semantics/openAccess Except where explicitly stated otherwise, this item is published under the following license: Creative Commons Attribution-NonCommercial-ShareAlike 2.5 Unported (CC BY-NC-SA 2.5)
Identifiers
URI: http://hdl.handle.net/11336/221893
URL: http://proceedings.mlr.press/v202/li23ar/li23ar.pdf
URL: https://proceedings.mlr.press/v202/
Collections
Events (IAR)
Events of INST.ARG.DE RADIOASTRONOMIA (I)
Citation
Alternating Local Enumeration (TnALE): solving tensor network structure search with fewer evaluations; 40th International Conference on Machine Learning; Honolulu; United States; 2023; 20384-20411