Promising Directions for Future Work
Sergey I. Nikolenko (Synthesis AI)
Chapter 12 in Synthetic Data for Deep Learning, 2021, pp. 285-294, from Springer
Abstract:
In this concluding chapter, we discuss the next steps that we can expect from the field of synthetic data for deep learning. We consider four ideas that are starting to gain traction in this field. First, procedural generation of synthetic data can allow for much larger synthetic datasets, or even datasets generated on the fly. Second, recent works try to make the shift from domain randomization to a generation feedback loop, adapting synthetic data generation to the model and problem at hand. Third, we discuss how to best incorporate additional knowledge into domain adaptation architectures. Fourth, we show examples of introducing extra modalities into synthetic datasets with the purpose of improving downstream tasks that formally might not even use these modalities.
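The "generation feedback loop" mentioned above can be illustrated with a minimal sketch: candidate parameter sets for a synthetic data generator are scored by the validation accuracy of a downstream model trained on the generated data, and the best-scoring parameters are kept. All names here (`generate_synthetic`, the `noise` parameter, the threshold classifier) are hypothetical illustrations, not the chapter's actual method.

```python
import random

random.seed(0)

def generate_synthetic(params, n=200):
    # Hypothetical generator: a 1-D feature with a class-dependent shift;
    # the 'noise' parameter controls how hard the generated task is.
    data = []
    for _ in range(n):
        label = random.randint(0, 1)
        x = label + random.gauss(0.0, params["noise"])
        data.append((x, label))
    return data

def train_and_eval(train, val):
    # Trivial stand-in model: threshold halfway between class means.
    class0 = [x for x, y in train if y == 0]
    class1 = [x for x, y in train if y == 1]
    threshold = (sum(class0) / len(class0) + sum(class1) / len(class1)) / 2
    correct = sum((x > threshold) == (y == 1) for x, y in val)
    return correct / len(val)

def feedback_loop(val, candidates):
    # Feedback loop: score each candidate generator configuration by
    # downstream validation accuracy and keep the best one.
    best_params, best_acc = None, -1.0
    for params in candidates:
        synthetic = generate_synthetic(params)
        acc = train_and_eval(synthetic, val)
        if acc > best_acc:
            best_params, best_acc = params, acc
    return best_params, best_acc

# A stand-in for real validation data that the generator should serve.
val = [(random.gauss(y, 0.3), y)
       for y in (random.randint(0, 1) for _ in range(100))]
candidates = [{"noise": n} for n in (0.1, 0.5, 2.0)]
best, acc = feedback_loop(val, candidates)
```

In a realistic setting the generator would be a procedural 3D pipeline or a learned model, and the search over its parameters would use gradient-based or black-box optimization rather than exhaustive scoring, but the closed loop from generation through downstream evaluation back to generation parameters is the same.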
Date: 2021
Persistent link: https://EconPapers.repec.org/RePEc:spr:spochp:978-3-030-75178-4_12
Ordering information: This item can be ordered from http://www.springer.com/9783030751784
DOI: 10.1007/978-3-030-75178-4_12
Series: Springer Optimization and Its Applications, Springer