Blackboard Computing Adventures 💡
Today we reviewed another keynote, delivered by a distinguished figure in the ACM SLE community: Professor Martin Erwig. This 2010 presentation is quite important in the SLE field, not just because it occurred in the earliest years of the field, but…
---[Brief Bio]:
Martin Erwig is a Professor of Computer Science in the School of Electrical Engineering and Computer Science at Oregon State University[1]. He has a rich academic background with degrees from the University of Dortmund and the University of Hagen in Germany[1]. His research interests include language design and domain-specific languages, functional programming, and visual languages[1].
In 2000 he emigrated from Germany to the United States; he now lives with his family in Corvallis, Oregon[2]. Prof. Erwig is also the author of the award-winning book "Once Upon an Algorithm: How Stories Explain Computing", which has been translated into several languages[1]. He has published over 160 peer-reviewed articles and received multiple best paper awards for his work[1].
---[About Paper]:
First, this one-page paper accompanied a keynote talk at the 3rd International Conference on Software Language Engineering (SLE), held in Eindhoven, Netherlands, in 2010[3]. It concerns his work on the Choice Calculus (CC), meant to formalize and streamline future work on variability in software artefacts, systems, and expressions[4]. It relates to well-known work in the programming-language engineering field through its relationship to the Lambda Calculus[4]. Though only sketched in this abstract, the full paper introduces the CC's syntax, semantics, and potential applications: choices, and dimensions that group choices[4].
---[REFS]:
1. https://engineering.oregonstate.edu/people/martin-erwig
2. https://www.amazon.in/stores/author/B004575Y1O/about
3. https://www.sleconf.org/2010/
4. Erwig, M. (2010). A language for software variation research. ACM SIGPLAN Notices, 46(2), 3-12. URL: https://web.engr.oregonstate.edu/~erwig/papers/VariationLang_GPCE10.pdf
Hello students, researchers, faculty and visitors! Perhaps we'll take a break from the usual Blackboard Adventures around Christmas---for good measure, nonetheless, we'll likely want to quickly wrap up some unfinished R&D business before 2024 closes. So, wish me well 🤞🙏🤞
Reviewing ACM SLE papers continues... 👏🏻👏🏻
With a somewhat long list of co-authors, the "Trellis paper"[3] (not to be confused with the game "Tetris") is among the most exciting SLE papers I've come across so far. Published this year and, unusually, spearheaded by a PhD student, Lars Hummelgren (rather than a professor, as with many past ACM SLE papers we've covered here on BA), it is a terrific work touching on several important matters in contemporary machine-learning research. Amusingly, Lars's surname almost sounds like the HMMs he researches here!
---[Brief Bio]:
Lars Hummelgren is a PhD student at the Division of Software and Computer Systems at KTH Royal Institute of Technology in Stockholm, Sweden[1]. His research interests include programming languages, type systems, and efficient compilation to CPUs and GPUs[1]. He has published several papers on topics such as GPU compilation, domain-specific languages, and probabilistic programming[2].
---[About Paper]:
The paper[3] is about a popular machine/statistical-learning technique, Hidden Markov Models (HMMs), specifically the sparse kind, and it includes a great mini-introduction to HMMs for newcomers. As with most SLE papers, the focus is on a new software language, Trellis in this case, which is shown both formally and empirically to be more performant on time-series learning problems involving sparse models (arguably more representative of realistic problems). Trellis is a non-executable DSL[4]: its compiler takes a Trellis model of a sparse HMM and generates a Python library through which CUDA-powered inference can be run via a clean API[3]. The paper delves into the details of that process and also discusses interesting related prior work.
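For readers new to the topic, here is the kind of computation that benefits from sparsity: the HMM forward algorithm, where only the nonzero transitions are stored and summed. This is a pure-Python illustration of the general idea under my own encoding (`sparse_trans[j]` lists the predecessors of state `j`); it is not the Trellis language or its generated API.

```python
def forward_sparse(init, sparse_trans, emit, obs):
    """Forward algorithm over a sparse transition structure.
    sparse_trans[j] = list of (i, p) pairs with p = P(j | i) > 0."""
    # Initialize with the first observation.
    alpha = {s: p * emit[s][obs[0]] for s, p in init.items()}
    for o in obs[1:]:
        # Only iterate over transitions that actually exist.
        alpha = {
            j: emit[j][o] * sum(alpha.get(i, 0.0) * p for i, p in preds)
            for j, preds in sparse_trans.items()
        }
    return sum(alpha.values())  # P(observation sequence)

# Toy two-state model where state A is unreachable from B.
init = {"A": 0.6, "B": 0.4}
trans = {"A": [("A", 0.7)], "B": [("A", 0.3), ("B", 1.0)]}
emit = {"A": {"x": 0.9, "y": 0.1}, "B": {"x": 0.2, "y": 0.8}}
print(forward_sparse(init, trans, emit, ["x", "y"]))
```

With dense matrices this loop costs O(N²) per step; storing only existing edges makes it proportional to the number of nonzero transitions, which is the efficiency Trellis exploits at scale on GPUs.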
---[REFS]:
1. https://www.kth.se/profile/larshum
2. https://scholar.google.com/citations?user=l6ppBDIAAAAJ&hl=en
3. Hummelgren, L., Palmkvist, V., Stjerna, L., Xu, X., JaldΓ©n, J., & Broman, D. (2024, October). Trellis: A Domain-Specific Language for Hidden Markov Models with Sparse Transitions. In Proceedings of the 17th ACM SIGPLAN International Conference on Software Language Engineering (pp. 196-209). URL: https://dl.acm.org/doi/pdf/10.1145/3687997.3695641
4. Lutalo, Joseph Willrich. "Software Language Engineering-Text Processing Language Design, Implementation, Evaluation Methods." (2024). URL: https://www.preprints.org/frontend/manuscript/3903e4cd075074a7005cb705a5ef26c5/download_pub
#review #notes #acm #sle #jwl #phd
Some highlights of my review of Lars's 2024 Trellis paper.
Forwarded from Museum of ~{MAZERA}~
https://youtu.be/pDj3gp8UXRA?feature=shared
The OFFICIAL MAZERA Band Christmas 25 DECEMBER live concert trailer. Action-packed, never-before-heard material & more... Direct from the Deep Metal Scene of Contemporary Africa
Voice_Assistants_Leveraging_Macro_Program_augmented_QAKBs_research.pdf
145.1 KB
We explore a novel approach to enhancing voice-operated personal assistants by integrating a lightweight text processing language, TEA, into our existing Question-Answer Knowledge Bases (QAKBs). This allows for dynamic, context-aware responses and multi-turn interactions, paving the way for smarter and more adaptable AI assistants. I invite you to read the abstract paper and share your thoughts! I am also looking forward to some support or a research grant to further the work we started years ago, finally bringing together the VOSA and TEA language projects!
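The general mechanism described above can be sketched in a few lines: a QAKB whose stored answers embed small macros that are expanded from the interaction context at answer time. Everything here is hypothetical for illustration — the `{macro}` syntax, the `QAKB` table, and the names `expand`/`answer`/`VOSA` context values are invented and are not the actual TEA language or VOSA API.

```python
import re
from datetime import date

# Hypothetical QAKB: answers are templates with {macro} placeholders.
QAKB = {
    "what day is it": "Today is {today}.",
    "who are you": "I am {assistant_name}, your voice assistant.",
}

def expand(template, context):
    """Expand {macro} placeholders from the interaction context;
    unknown macros are left untouched."""
    return re.sub(r"\{(\w+)\}",
                  lambda m: str(context.get(m.group(1), m.group(0))),
                  template)

def answer(question, context):
    template = QAKB.get(question.lower().strip("? "))
    return expand(template, context) if template else "I don't know yet."

ctx = {"today": date(2024, 12, 20).isoformat(), "assistant_name": "VOSA"}
print(answer("What day is it?", ctx))  # Today is 2024-12-20.
```

The point of the design is that the knowledge base stays static while the macro layer injects per-conversation state, which is what enables context-aware and multi-turn behaviour without duplicating answers.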
Forwarded from UGANDA
"2024 has been a Year of Hard Work, 2025 We Hope to Harvest Fruits of that work, and 2026 A Start of Whole New Future for UGANDA and UGANDANs everywhere. Greetings from IP, to all netizens and citizens across all platforms, levels, communities and jurisdictions. HAPPY New Year to U!" --- Joseph L. Willrich Cwa Mukama R.W. on behalf of UGANDA's core Internet Community (UIC).