B-spline curve approximation with transformer neural networks
Mathis Saillot,
Dominique Michel and
Ahmed Zidna
Mathematics and Computers in Simulation (MATCOM), 2024, vol. 223, issue C, 275-287
Abstract:
Approximating a curve with a B-spline is a well-known problem with many challenges. Computing the parameter values and the knot vector that lead to the best approximation of a point sequence remains an open problem. Existing methods are usually based on heuristics, genetic algorithms, or other meta-heuristics. More recently, deep neural networks have demonstrated their usefulness, as shown by the use of a Multi-Layer Perceptron in the existing literature. Since its inception, the Transformer architecture has achieved state-of-the-art results in multiple domains, such as Natural Language Processing and Computer Vision. In this paper, we propose a knot placement method that uses a Transformer neural network architecture for B-spline approximation. We present and compare the results of our ongoing experiments with Transformers for B-spline curve approximation, and we conclude with possible improvements and modifications to our method for future experiments.
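For readers unfamiliar with the fitting problem described in the abstract, below is a minimal sketch (in Python, using SciPy) of least-squares B-spline curve fitting once a parameterization and a knot vector have already been fixed. The chord-length parameterization, the uniform interior knots, and the helper fit_bspline_curve are illustrative assumptions, not the paper's method: the paper addresses precisely the knot-placement step, which here is replaced by a naive uniform choice.

import numpy as np
from scipy.interpolate import make_lsq_spline

def fit_bspline_curve(points, n_interior_knots=8, degree=3):
    """Least-squares fit of a degree-k B-spline curve to an (m, 2) point array."""
    pts = np.asarray(points, dtype=float)

    # Chord-length parameterization of the points, normalized to [0, 1].
    dists = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    u = np.concatenate(([0.0], np.cumsum(dists)))
    u /= u[-1]

    # Clamped knot vector: uniform interior knots, end knots repeated k+1 times.
    # (Choosing these interior knots well is the open problem the paper targets.)
    k = degree
    interior = np.linspace(0.0, 1.0, n_interior_knots + 2)[1:-1]
    t = np.concatenate(([0.0] * (k + 1), interior, [1.0] * (k + 1)))

    # Fit each coordinate as a scalar B-spline function of the parameter u.
    spl_x = make_lsq_spline(u, pts[:, 0], t, k=k)
    spl_y = make_lsq_spline(u, pts[:, 1], t, k=k)
    return spl_x, spl_y

# Usage: fit a noisy sampled curve.
u_true = np.linspace(0.0, 2.0 * np.pi, 200)
pts = np.column_stack((np.cos(u_true), np.sin(2.0 * u_true)))
pts += 0.01 * np.random.default_rng(0).normal(size=pts.shape)
spl_x, spl_y = fit_bspline_curve(pts)

The approximation quality depends strongly on where the interior knots are placed; the paper's contribution is to predict that knot vector with a Transformer instead of using a fixed uniform layout as above.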
Keywords: B-spline; Curve approximation; Transformer neural network; Knot vector prediction
Date: 2024
Downloads: http://www.sciencedirect.com/science/article/pii/S0378475424001368 (full text for ScienceDirect subscribers only)
Persistent link: https://EconPapers.repec.org/RePEc:eee:matcom:v:223:y:2024:i:c:p:275-287
DOI: 10.1016/j.matcom.2024.04.010
Mathematics and Computers in Simulation (MATCOM) is currently edited by Robert Beauwens
More articles in Mathematics and Computers in Simulation (MATCOM) from Elsevier
Bibliographic data for series maintained by Catherine Liu.