A Neural Network Model Secret-Sharing Scheme with Multiple Weights for Progressive Recovery

Xianhui Wang, Hong Shan, Xuehu Yan, Long Yu and Yongqiang Yu
Additional contact information
Xianhui Wang: College of Electronic Engineering, National University of Defense Technology, Hefei 230037, China
Hong Shan: College of Electronic Engineering, National University of Defense Technology, Hefei 230037, China
Xuehu Yan: College of Electronic Engineering, National University of Defense Technology, Hefei 230037, China
Long Yu: College of Electronic Engineering, National University of Defense Technology, Hefei 230037, China
Yongqiang Yu: College of Electronic Engineering, National University of Defense Technology, Hefei 230037, China

Mathematics, 2022, vol. 10, issue 13, 1-17

Abstract: As deep-learning models see widespread use in production environments, their value has become increasingly prominent. Two key issues are protecting the rights of model trainers and securing the specific scenarios in which the models are used. In the commercial domain, consumers pay different fees and receive correspondingly different levels of service. It is therefore necessary to divide a model into several shadow models with multiple weights. When holders want to use the model, they can recover a model whose performance corresponds to the number and weights of the collected shadow models, so that access to the model is controlled progressively; i.e., progressive recovery is significant. This paper proposes a neural network model secret-sharing scheme (NNSS) with multiple weights for progressive recovery. The scheme uses Shamir's polynomial to control the sharing and embedding phases of the model parameters, which in turn enables hierarchical performance control in the secret-model recovery phase. First, the important model parameters are extracted. Then, in the sharing phase, effective shadow parameters are assigned according to the holders' weights, and t shadow models are generated. During the recovery phase, the holders can obtain a sufficient number of shadow parameters for recovering the secret parameters with a certain probability. This probability increases with the number of shadow models obtained, and the performance of the extracted model depends on the participants' weights. The probability is proportional to the number and weights of the shadow models obtained; when all t shadow models are collected, the shadow parameters are recovered with probability 1, i.e., the reconstructed model attains the performance of the secret model. A series of experiments on VGG19 verifies the effectiveness of the scheme.
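The Shamir-style sharing described in the abstract can be illustrated on a single model parameter. The sketch below is an illustrative assumption, not the paper's implementation: it quantizes one weight into a small prime field, splits it with a degree-(k-1) polynomial into t shares (one per shadow model), and reconstructs it by Lagrange interpolation once any k shares are available. The field size, quantization step, and threshold values are hypothetical choices for demonstration only.

```python
import random

P = 257  # small prime field; a real scheme would pick P above the quantization range

def share(secret, k, t):
    """Split `secret` (an int mod P) into t Shamir shares with threshold k."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, t + 1)]

def reconstruct(shares):
    """Recover the secret via Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        secret = (secret + yj * num * pow(den, -1, P)) % P
    return secret

# Quantize a float weight into the field, share it, and recover it from k shares.
w = 0.4375
q = int(round(w * 256)) % P          # quantized parameter: 112
shares = share(q, k=3, t=5)
assert reconstruct(shares[:3]) == q  # any 3 of the 5 shares suffice
```

In the progressive setting the abstract describes, higher-weight holders would receive more of these shadow parameters per shadow model, so the fraction of secret parameters they can reconstruct (and hence the recovered model's performance) scales with the number and weights of the shares collected.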

Keywords: secret sharing; neural network model; progressive recovery; multiple weights (search for similar items in EconPapers)
JEL-codes: C (search for similar items in EconPapers)
Date: 2022
References: View complete reference list from CitEc

Downloads: (external link)
https://www.mdpi.com/2227-7390/10/13/2231/pdf (application/pdf)
https://www.mdpi.com/2227-7390/10/13/2231/ (text/html)

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.


Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:10:y:2022:i:13:p:2231-:d:847963


Mathematics is currently edited by Ms. Emma He

More articles in Mathematics from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.

Page updated 2025-03-19
Handle: RePEc:gam:jmathe:v:10:y:2022:i:13:p:2231-:d:847963