SPLASH 2022
Mon 5 - Sat 10 December 2022 Auckland, New Zealand
Wed 30 Nov 2022 22:00 - 22:15 at Virtual Airmeet Room - Session 2 Chair(s): Sophia Drossopoulou

Most users of low-code platforms, such as Excel and PowerApps, write programs in domain-specific formula languages to carry out nontrivial tasks. Often users can write most of the program they want, but introduce small mistakes that yield broken formulas. These mistakes, which can be both syntactic and semantic, are hard for low-code users to identify and fix, even though they can be resolved with just a few edits. We formalize the problem of producing such edits as the "last-mile repair" problem. To address this problem, we developed LaMirage, a LAst-MIle RepAir-engine GEnerator that combines symbolic and neural techniques to perform last-mile repair in low-code formula languages. LaMirage takes a grammar and a set of domain-specific constraints/rules, which jointly approximate the target language, and uses these to generate a repair engine that can fix formulas in that language. To tackle the challenges of localizing errors and ranking candidate repairs, LaMirage leverages neural techniques, whereas it relies on symbolic methods to generate candidate edits. This combination allows LaMirage to find repairs that satisfy the provided grammar and constraints, and then pick the most natural repair. We compare LaMirage to state-of-the-art neural and symbolic approaches on 400 real Excel and Power Fx formulas, where LaMirage outperforms all baselines. We release these benchmarks to encourage subsequent work in low-code domains.
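The generate-then-rank loop the abstract describes can be sketched in a few lines. This is a toy illustration, not the LaMirage implementation: the well-formedness check (balanced parentheses only) stands in for the grammar and domain constraints, the single-character edit enumeration stands in for symbolic candidate generation, and the longest-common-prefix heuristic stands in for the neural ranker. All function names are hypothetical.

```python
# Minimal sketch of a neurosymbolic last-mile repair loop (hypothetical names,
# NOT the LaMirage implementation).
from typing import Iterator, List, Optional


def is_well_formed(formula: str) -> bool:
    """Symbolic stand-in for the grammar + constraints: balanced parentheses."""
    depth = 0
    for ch in formula:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth < 0:
                return False
    return depth == 0


def candidate_edits(formula: str) -> Iterator[str]:
    """Symbolically enumerate small edits: one-character insertions and deletions."""
    for i in range(len(formula) + 1):
        for tok in "(),":
            yield formula[:i] + tok + formula[i:]
    for i in range(len(formula)):
        yield formula[:i] + formula[i + 1:]


def rank(candidates: List[str], broken: str) -> List[str]:
    """Stand-in for a learned ranker: prefer repairs that diverge from the
    user's input as late as possible (a crude 'naturalness' proxy)."""
    def lcp(c: str) -> int:
        n = 0
        for a, b in zip(c, broken):
            if a != b:
                break
            n += 1
        return n
    return sorted(candidates, key=lcp, reverse=True)


def last_mile_repair(formula: str) -> Optional[str]:
    if is_well_formed(formula):
        return formula  # nothing to fix
    # Keep only candidates the symbolic check accepts, then rank them.
    valid = [c for c in dict.fromkeys(candidate_edits(formula)) if is_well_formed(c)]
    ranked = rank(valid, formula)
    return ranked[0] if ranked else None


print(last_mile_repair("SUM(A1,B1"))  # -> SUM(A1,B1)
```

A real engine would of course parse against the supplied grammar, consider multi-token edits, and score candidates with a trained model, but the division of labor (symbolic enumeration and filtering, learned ranking) is the same.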

Wed 30 Nov

Displayed time zone: Auckland, Wellington

21:00 - 22:45
Session 2V-OOPSLA at Virtual Airmeet Room
Chair(s): Sophia Drossopoulou Meta and Imperial College London
21:00
15m
Talk
Taming Transitive Redundancy for Context-Free Language Reachability (Pre-recorded)
V-OOPSLA
Yuxiang Lei (University of Technology Sydney), Yulei Sui (University of New South Wales, Sydney), Shuo Ding (Georgia Institute of Technology), Qirun Zhang (Georgia Institute of Technology)
DOI
21:15
15m
Talk
Scalable Linear Invariant Generation with Farkas’ Lemma
V-OOPSLA
Hongming Liu (Shanghai Jiao Tong University), Hongfei Fu (Shanghai Jiao Tong University), Zhiyong Yu (Shanghai Jiao Tong University), Jiaxin Song (Shanghai Jiao Tong University), Guoqiang Li (Shanghai Jiao Tong University)
DOI
21:30
15m
Talk
Consistency-Preserving Propagation for SMT Solving of Concurrent Program Verification (Pre-recorded)
V-OOPSLA
Zhihang Sun (Tsinghua University), Hongyu Fan (Tsinghua University), Fei He (Tsinghua University)
DOI
21:45
15m
Talk
Oracle-Free Repair Synthesis for Floating-Point Programs (Pre-recorded)
V-OOPSLA
Daming Zou (ETH Zurich), Yuchen Gu (Peking University), Yuanfeng Shi (Peking University), Mingzhe Wang (Princeton University), Yingfei Xiong (Peking University), Zhendong Su (ETH Zurich)
DOI
22:00
15m
Talk
Neurosymbolic Repair for Low-Code Formula Languages
V-OOPSLA
Rohan Bavishi (University of California at Berkeley), Harshit Joshi (Microsoft), José Pablo Cambronero (Microsoft), Anna Fariha (Microsoft), Sumit Gulwani (Microsoft), Vu Le (Microsoft), Ivan Radiček (Microsoft), Ashish Tiwari (Microsoft)
DOI
22:15
30m
Live Q&A
Q&A for Session 2
V-OOPSLA