Cross-Lingual Effectiveness of BERT on Aspect-Based Sentiment Analysis
Evaluated how well a model's learning transfers across languages in sentiment analysis tasks and recommended effective approaches.
Description
We applied multilingual BERT (mBERT) to the document-level laptop-review dataset from SemEval-2016 for the aspect-based sentiment analysis task and tested the model's ability to perform zero-shot cross-lingual transfer learning from English to Chinese. The results suggest that multilingual BERT can transfer its learning of complex, document-level text relationships from English to Chinese.
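As a rough illustration of this setup, the sketch below fine-tunes mBERT on English documents with Hugging Face Transformers and then evaluates it on Chinese reviews without any Chinese supervision. The file names, the "text" and "label" columns, and the three-way polarity labels are assumptions for the example, not the project's actual pipeline.

```python
import pandas as pd
import torch
from torch.utils.data import DataLoader, TensorDataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_NAME = "bert-base-multilingual-cased"  # multilingual BERT (mBERT)
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
# Assumes three-way polarity labels (0 = negative, 1 = neutral, 2 = positive).
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=3)

def encode(df: pd.DataFrame) -> TensorDataset:
    # Tokenize full documents; truncation keeps inputs within BERT's 512-token limit.
    enc = tokenizer(list(df["text"]), padding=True, truncation=True, return_tensors="pt")
    return TensorDataset(enc["input_ids"], enc["attention_mask"],
                         torch.tensor(df["label"].values))

# Hypothetical preprocessed files with "text" and "label" columns.
train_df = pd.read_csv("laptops_en_train.tsv", sep="\t")
test_df = pd.read_csv("reviews_zh_test.tsv", sep="\t")

# Fine-tune on English data only (a single epoch shown for brevity).
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for input_ids, mask, labels in DataLoader(encode(train_df), batch_size=16, shuffle=True):
    loss = model(input_ids=input_ids, attention_mask=mask, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Zero-shot evaluation: the model sees no Chinese supervision at any point.
model.eval()
correct = total = 0
with torch.no_grad():
    for input_ids, mask, labels in DataLoader(encode(test_df), batch_size=32):
        preds = model(input_ids=input_ids, attention_mask=mask).logits.argmax(dim=-1)
        correct += (preds == labels).sum().item()
        total += labels.numel()
print(f"Zero-shot accuracy on Chinese reviews: {correct / total:.3f}")
```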
We also demonstrated and discussed the challenges that an imbalanced data distribution poses for aspect-based sentiment analysis when the number of aspect categories is large.
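One common mitigation for such skewed category distributions is to reweight the loss by inverse class frequency; the sketch below shows this, assuming a hypothetical preprocessed file with an "aspect_category" column rather than the project's actual data.

```python
import pandas as pd
import torch

# Hypothetical preprocessed file with an "aspect_category" column.
df = pd.read_csv("laptops_en_train.tsv", sep="\t")

counts = df["aspect_category"].value_counts().sort_index()
print(counts)  # typically a long tail: a few categories dominate, many are rare

# Weight each class inversely to its frequency (normalized to mean 1),
# so errors on rare aspect categories are penalized more heavily.
# The weight order must match the integer encoding of the categories.
weights = (counts.sum() / counts).astype("float32")
weights /= weights.mean()
loss_fn = torch.nn.CrossEntropyLoss(weight=torch.tensor(weights.values))
```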
Techniques
- NLP
- Text embedding
- Cloud computing
Tools
- PyTorch
- BERT
- GCP
- Pandas
- Matplotlib
More Information
More information can be found at the following links: