About me
Senior Machine Learning Engineer at Kavak.com. I completed my master's degree in Data Science at ETH Zurich and obtained my bachelor's degree in Telematics Engineering from Instituto Politecnico Nacional (IPN), Mexico.
I am interested in building safe and trustworthy AI systems with a special interest in natural language processing applications.
Previously, I worked as a Data Scientist at OPI Analytics in Mexico City.
Publications
Analyzing the Role of Semantic Representations in the Era of Large Language Models
Zhijing Jin, Yuen Chen, Fernando Gonzalez, Jiarui Liu, Jiayi Zhang, Julian Michael, Bernhard Schölkopf, Mona Diab
NAACL 2024
Paper Code
CLADDER: Assessing Causal Reasoning in Language Models
Zhijing Jin, Yuen Chen, Felix Leeb, Ojasv Kamal, Zhiheng Lyu, Kevin Blin, Fernando Gonzalez, Max Kleiman-Weiner, Luigi Gresele, Mrinmaya Sachan, Bernhard Schölkopf
NeurIPS 2023
Paper Code
Beyond Good Intentions: Reporting the Research Landscape of NLP for Social Good
Fernando Gonzalez*, Zhijing Jin*, Bernhard Schölkopf, Tom Hope, Mrinmaya Sachan, Rada Mihalcea
EMNLP 2023
Paper Code
When to Make Exceptions: Exploring Language Models as Accounts of Human Moral Judgment
Zhijing Jin*, Sydney Levine*, Fernando Gonzalez*, Ojasv Kamal, Maarten Sap, Mrinmaya Sachan, Rada Mihalcea, Josh Tenenbaum, Bernhard Schölkopf
NeurIPS 2022
Paper Code
Projects
Website for LYLA dataset
I collaborated on the development of the website presenting the Lynching in Latin America (LYLA) dataset, created by researchers at the Center for Security Studies at ETH Zurich.
Fernando Gonzalez and Cristina Guzman
2022
NLP Analysis of Legal Briefs
In this paper, we predict appeal outcomes for legal briefs submitted to U.S. appellate courts.
Anver Khusainov, Clémence Lanfranchi, Fernando Gonzalez, Naman Goel, Dominik Stammbach, and Elliott Ash
Data Science Lab @ ETH Zurich 2021
Paper
The Best TIM: Human Motion Prediction
Given human motion sequences, we predict how the motion continues in future frames using a Temporal Inception Module (TIM).
Fernando Gonzalez, Cristina Guzman, and Zixin Shu
Final project, Machine Perception @ ETH Zurich 2021
Paper
Distraction Generation Using T5
In this paper, we generate distractor answers for multiple-choice questions given a reference text, a question, and the correct answer. We fine-tune a transformer (T5) on the distractor generation task with an additional loss term that penalizes distractors similar to the correct answer.
Fernando Gonzalez and Kevin Golan
Final project, Natural Language Processing @ ETH Zurich 2020
Paper Code