Abstract: Multitask learning with a pretext task has excelled in time-series classification tasks lacking labeled data. The key to multitask learning is to build a pretext task and learn the most ...
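The abstract only names the idea, so the following is a minimal sketch, not the paper's method: a shared encoder for time series feeds a classification head (trained on the few labeled series) and a pretext head that predicts which transformation was applied (trained on all series). The architecture, the transformation set, and the loss weight are illustrative assumptions.

```python
# Hypothetical multitask setup: supervised classification + self-supervised
# pretext task (predicting the applied transformation) over a shared encoder.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedEncoder(nn.Module):
    def __init__(self, in_channels=1, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, hidden, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),      # global average pooling over time
        )

    def forward(self, x):                 # x: (batch, channels, length)
        return self.net(x).squeeze(-1)    # (batch, hidden)

class MultitaskModel(nn.Module):
    def __init__(self, num_classes, num_pretext=3, hidden=64):
        super().__init__()
        self.encoder = SharedEncoder(hidden=hidden)
        self.cls_head = nn.Linear(hidden, num_classes)       # supervised task
        self.pretext_head = nn.Linear(hidden, num_pretext)   # pretext task

    def forward(self, x):
        z = self.encoder(x)
        return self.cls_head(z), self.pretext_head(z)

def pretext_transform(x):
    """Apply one of three illustrative transformations; return (x_aug, id)."""
    t = torch.randint(0, 3, (1,)).item()
    if t == 0:
        x_aug = x                                  # identity
    elif t == 1:
        x_aug = torch.flip(x, dims=[-1])           # time reversal
    else:
        x_aug = x + 0.1 * torch.randn_like(x)      # Gaussian jitter
    return x_aug, t

# One illustrative training step on toy data.
model = MultitaskModel(num_classes=5)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

x_lab = torch.randn(8, 1, 128)             # labeled series
y_lab = torch.randint(0, 5, (8,))
x_unlab = torch.randn(32, 1, 128)          # unlabeled series

aug, t_ids = zip(*(pretext_transform(s) for s in x_unlab))
x_aug, t_ids = torch.stack(aug), torch.tensor(t_ids)

cls_logits, _ = model(x_lab)
_, pretext_logits = model(x_aug)
loss = (F.cross_entropy(cls_logits, y_lab)
        + 0.5 * F.cross_entropy(pretext_logits, t_ids))   # 0.5 is an assumed weight
opt.zero_grad(); loss.backward(); opt.step()
```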
Abstract: We propose a semi-supervised ordinal classification method based on ranking consistency regularization, addressing limitations in capturing ordinal relationships and mitigating semantic ...
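Since the snippet does not give the exact loss, here is a hedged sketch of one common way to realize ranking-consistency regularization for semi-supervised ordinal classification: the model outputs K-1 cumulative logits for P(y > k); labeled data use a per-threshold binary cross-entropy, and unlabeled data are penalized when two augmented views disagree in their cumulative probabilities. The network, augmentation, and loss weight below are assumptions for illustration only.

```python
# Hypothetical ranking-consistency regularizer over cumulative ordinal outputs.
import torch
import torch.nn as nn
import torch.nn.functional as F

K = 5  # number of ordinal levels (assumed)

class OrdinalNet(nn.Module):
    def __init__(self, in_dim=32, hidden=64, num_levels=K):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.thresholds = nn.Linear(hidden, num_levels - 1)   # logits for P(y > k)

    def forward(self, x):
        return self.thresholds(self.backbone(x))              # (batch, K-1)

def ordinal_targets(y, num_levels=K):
    """y in {0..K-1} -> binary targets t_k = 1[y > k], shape (batch, K-1)."""
    ks = torch.arange(num_levels - 1, device=y.device)
    return (y.unsqueeze(1) > ks).float()

def supervised_loss(logits, y):
    return F.binary_cross_entropy_with_logits(logits, ordinal_targets(y))

def ranking_consistency_loss(logits_a, logits_b):
    """Penalize disagreement between the cumulative probabilities of two views."""
    return F.mse_loss(torch.sigmoid(logits_a), torch.sigmoid(logits_b))

# Illustrative step with toy features; "augment" is a stand-in perturbation.
model = OrdinalNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
augment = lambda x: x + 0.05 * torch.randn_like(x)

x_lab, y_lab = torch.randn(16, 32), torch.randint(0, K, (16,))
x_unlab = torch.randn(64, 32)

loss = (supervised_loss(model(x_lab), y_lab)
        + 1.0 * ranking_consistency_loss(model(augment(x_unlab)),
                                         model(augment(x_unlab))))
opt.zero_grad(); loss.backward(); opt.step()
```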