My Gripes with Prolog
28 points by hwayne
Prolog, the language that makes parsers easy to write, and string handling impossible.
You may enjoy exploring languages like Griswold's Icon, where goal-directed evaluation remains, but it's far more imperative in flavour!
As a Prolog enthusiast and occasional Scryer Prolog contributor, those are all fair points. Many of them have been corrected in specific dialects of Prolog, but there's no consensus to bring to the ISO standard (which, in case you weren't aware, had an update last year to add DCGs!). Strings, data types, functions and even the sort/2 predicate have bitten me in the past.
Regarding cuts, I've seen that many problems can be solved by using reified predicates and dif/2. dif/2 was present in the earliest versions of Prolog, but was later replaced by the current \=, probably for performance reasons, even though the Prolog designers already knew it wasn't as clean as dif/2. Reified predicates, on the other hand, allow us to express many more things in a clear, monotonic and bidirectional way, but it's like programming in a higher layer and it is more verbose than "obvious" Prolog.
I believe foo(A, B) is failing when called without instantiations because A = B succeeds in every case except where A and B are already instantiated to different values. What you want here is dif(A, B) instead of \+ (A = B), which is non-standard but I think most Prologs provide it. Notice that the query dif(A, B), foo(A, B) will succeed.
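A minimal sketch of the difference, reconstructing foo/2 from the post alongside a made-up dif/2-based variant (bar/2 is a hypothetical name; behaviour as I'd expect it in SWI-Prolog):

```prolog
% foo/2 with negation-as-failure: fails for unbound arguments,
% because A = B can succeed, so its negation fails.
foo(A, B) :- \+ (A = B).

% bar/2 with dif/2: posts a disequality constraint that is
% checked incrementally as the variables get bound.
bar(A, B) :- dif(A, B).

% ?- foo(X, Y).                  % fails
% ?- foo(1, 2).                  % succeeds (both arguments ground)
% ?- bar(X, Y).                  % succeeds, with dif(X, Y) pending
% ?- bar(X, Y), X = 1, Y = 2.    % succeeds
% ?- bar(X, Y), X = 1, Y = 1.    % fails when the constraint is violated
```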
Remember that =/2 in Prolog is neither assignment nor an equality test, it is unification. It acts like assignment and equality under many ordinary-looking circumstances but in fact it is sort of the whole engine of Prolog in general.
Also, you might enjoy using findall/3 instead of bagof/3, since it always produces exactly one result (an empty list if there are no solutions). I consider bagof/3 and setof/3 more useful because you can do things analogous to SQL GROUP BY, but backtracking over free variables is also what makes them harder to use correctly than findall/3.
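A small sketch of the contrast, using a made-up age/2 fact table (outcomes as I'd expect them in SWI-Prolog):

```prolog
age(alice, 30).
age(bob, 25).
age(carol, 30).

% findall/3 always succeeds exactly once, collecting all solutions:
% ?- findall(P, age(P, 30), Ps).   % Ps = [alice, carol]
% ?- findall(P, age(P, 99), Ps).   % Ps = []  (succeeds with an empty list)

% bagof/3 backtracks over free variables, grouping like SQL GROUP BY:
% ?- bagof(P, age(P, A), Ps).
%    A = 25, Ps = [bob] ;
%    A = 30, Ps = [alice, carol].
% ?- bagof(P, age(P, 99), Ps).     % fails (no solutions)

% Use Var^Goal to stop grouping on Var:
% ?- bagof(P, A^age(P, A), Ps).    % Ps = [alice, bob, carol]
```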
Prolog is old and has a lot of warts; the ones you enumerated are real, but (to me) they are just historical stumbling blocks we all trip over at the beginning.
that's my reading of the code too, two unbound variables X=Y always unifies
Somewhat aside: I think a lot of these problems come from going against "the grain" of Prolog. Not to say "you're holding it wrong", but Prolog is not an easy-to-hold language.
I.e., if you still think in terms of "this gets executed, then this gets executed", that's very much an imperative style, and although that works for some aspects (DCGs, parsing), a mindset of "does this match? this is matched next..." can be a bit more illuminating.
Like, append(_, [X], [1,2,3]) isn't so much "reversible computation" as a query: "what would I append to get [1,2,3]?" And the answer is X = 3.
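The standard two-clause definition makes that query reading concrete; all of these modes fall out of the same clauses. (append/3 is a library predicate; it's shown here only for clarity, and in SWI you'd want a different name to avoid clashing with the built-in version.)

```prolog
% append(Xs, Ys, Zs): Zs is the concatenation of Xs and Ys.
append([], Ys, Ys).
append([X|Xs], Ys, [X|Zs]) :- append(Xs, Ys, Zs).

% ?- append([1], [2,3], Zs).     % Zs = [1,2,3]   "what is the result?"
% ?- append(Xs, [3], [1,2,3]).   % Xs = [1,2]     "what do I append [3] to?"
% ?- append(_, [X], [1,2,3]).    % X = 3          "what is the last element?"
% ?- append(Xs, Ys, [1,2]).      % enumerates every split of [1,2]:
%    Xs = [], Ys = [1,2] ;
%    Xs = [1], Ys = [2] ;
%    Xs = [1,2], Ys = [].
```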
This ties to my biggest gripe with Prolog:
It looks like a declarative language, but you need to understand the behaviour of SLD resolution to ensure that a trivial-looking predicate will terminate, and a different search strategy requires a load of cuts to try to shoehorn it into SLD resolution's depth-first strategy.
I think the most important lesson that proof assistants learned from Prolog is that there is no good one-size-fits-all tactic and separating tactics from the logic system is much better than trying to combine them.
A metainterpreter is often an easier way to achieve breadth-first search in Prolog, which sounds scary but due to its homoiconicity and few primitives, it's usually not an imposing amount of work to achieve for some subset of your program. There's a tutorial about this at https://www.metalevel.at/acomip/ under "Changing the search strategy."
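For reference, the vanilla meta-interpreter that tutorials like that one start from is tiny, and a depth-limited variant is one way to get iterative deepening (a fair strategy) out of it. This is a sketch: in ISO systems, clause/2 requires the interpreted predicates to be dynamic.

```prolog
% Vanilla meta-interpreter: mirrors Prolog's own depth-first execution.
mi(true).
mi((A, B)) :- mi(A), mi(B).
mi(Goal) :-
    Goal \= true, Goal \= (_, _),   % guard: only look up user clauses
    clause(Goal, Body),
    mi(Body).

% Depth-limited variant: proofs may use at most N clause expansions deep.
mi_limit(true, _).
mi_limit((A, B), N) :- mi_limit(A, N), mi_limit(B, N).
mi_limit(Goal, N) :-
    Goal \= true, Goal \= (_, _),
    N > 0, N1 is N - 1,
    clause(Goal, Body),
    mi_limit(Body, N1).

% Iterative deepening: length(_, N) generates N = 0, 1, 2, ...
% Note this loops forever if Goal has no solution at any depth.
mi_id(Goal) :- length(_, N), mi_limit(Goal, N).
```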
In general, I think of cuts as a code smell, trying to impose an imperative order on something that should be expressed relationally. I use once/1 periodically when I want to get semi-determinism but otherwise I try to avoid them.
One definitely needs to get a feel for Prolog's engine of computation to use it productively. IMO it is a much simpler concept of computation than (say) bash or C. However, it's also totally unique, and it's frustrating to learn if you already know other languages because even when it looks similar to other systems the behavior is totally different.
The Prolog derivative I've used most recently is Rego, which is designed to write policies over JSON data. It adds some JSON types and Python-like list comprehensions to a Prolog-subset core. It's quite nice for this use case, because your policies can be written declaratively, but the lack of memoisation means that you can easily write accidentally quadratic (or cubic) code. It really misses something like Prolog's assert, but I found that quite clunky to use even in Prolog (and I have written some non-trivial Prolog, though not for a bit over 20 years).
Asserts are not widely used in modern Prolog either, but many engines now have tabling built in, which gets you memoization for cheap. In practice, Prolog usually indexes on the first argument, and many engines let you declare indexes on other arguments on a predicate-by-predicate basis, so the pain of quadratic code is lessened by the fact that the fact store is in memory and indexed. In general, Prolog is not designed for performance, but as with many things from the '70s and '80s, performance was a necessary research area, and it can do well despite being high level; there are implementations that prioritize it more than others (B-Prolog, YAP, etc.). SWI, for instance, is not considered a particularly fast implementation, but it's the most featureful. When I last looked at this years ago, GNU usually outperformed it.
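As a sketch of how cheap tabling is to use (SWI-Prolog directive syntax; other tabling engines such as XSB and YAP are similar):

```prolog
:- table fib/2.   % memoize answers to fib/2 subgoals

fib(0, 0).
fib(1, 1).
fib(N, F) :-
    N > 1,
    N1 is N - 1, N2 is N - 2,
    fib(N1, F1),
    fib(N2, F2),
    F is F1 + F2.

% Without the directive this definition is exponential; with it, each
% subgoal is computed once and reused.
% ?- fib(30, F).   % F = 832040
```

Tabling also makes left-recursive relations terminate, which is where it really shines compared to hand-rolled assert-based memoization.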
the classical way to have another search strategy is to build an evaluator but aiui modern prologs do support tabling at least
My own impression while learning Prolog is that writing it felt much more like simulating a bizarre machine than writing in any other programming language. Between moded predicates and cuts, you have to know what's being executed; it just happens that none of that is evident in the syntax.
I love Prolog, but it is certainly one of the higher learning curve languages out there. I’ve unfortunately passed far enough along now that I don’t even understand what the confusion is in the author’s “wtf” example 😅
With the “predicates not evaluating their arguments” issue, I don’t think it has to do with bidirectionality, but the ability to create ad-hoc data structures; in Prolog X is Y + 1 is just a term, canonically rendered as is(X, +(Y, 1)); arbitrary predicates have no way of knowing whether you want to pass in code to evaluate or data to work on.
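To make that concrete, the same term behaves differently depending on whether anything evaluates it (a sketch; outcomes as I'd expect them in SWI-Prolog):

```prolog
% ?- X = 3 + 4.             % X = 3+4: unification just builds the term +(3, 4)
% ?- X = 3 + 4, X =.. L.    % L = [+, 3, 4]: it's an ordinary compound term
% ?- X is 3 + 4.            % X = 7: is/2 is the predicate that evaluates
```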
In any case, I think Prolog can be a fantastically fun language to use once you "get it"; the extra lines needed to do that sort of X + 1 stuff that could be inline in most languages are more than made up for by hardly needing control structures and rarely having more than two levels of indentation. I've never worried about ISO portability though; I just go with SWI-Prolog (which is very easy to contribute to!)
His confusion is from reading "\+ (A = B)" as "A and B are different," but as you know, the right way to phrase this is dif(A, B).
To be fair to myself the \+ (A = B) was a minimal example. The usual place negation trips me up is when I'm writing something like this:
user_can_access(U, R) :-
    \+ user_denied(U, R),
    resource(R),
    attached_policy(U, P),
    allows(P, R).
This isn't solvable by dif; I have to move the negation to the end for queries to work properly.
It's true that this example is not addressed by dif/2. If U and R are ground when this is invoked it should behave like you expect. If one or both of the bindings are established by something in the rule body, then yes, the negation will need to be moved to the end. For instance, if you wanted to enumerate resources that a user U can access, or if you wanted to see what users could access a resource R, you would need to do that, since (presumably) resource/1 and attached_policy/2 will generate resources and policies. If necessary, you can create different versions of a predicate using guards like var/1 and ground/1 to address ordering issues, but since learning about those guards I haven't actually needed to use them. But maybe I would if I did more Prolog programming.
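Concretely, the reordered rule (same hypothetical predicates as above) runs the negation only once the rest of the body has made U and R ground:

```prolog
user_can_access(U, R) :-
    resource(R),             % generates candidate resources
    attached_policy(U, P),   % generates policies attached to U
    allows(P, R),
    \+ user_denied(U, R).    % U and R are ground by now, so \+ is safe
```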
The fact that ordering of clauses matters at all is considered a problem by some (the authors of Mercury, for instance) and a concession for computability by others (Prolog implementers I would imagine). It doesn't matter as much in ASP either for that matter.
There’s also freeze/2 or when/2 (in SWI at least) which can be used to defer goals until the arguments are sufficiently instantiated.
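A quick sketch of freeze/2's behaviour (SWI-Prolog; when/2 generalizes this to richer wake-up conditions):

```prolog
% freeze(X, Goal) delays Goal until X is bound.
% ?- freeze(X, X > 0), X = 5.    % succeeds: X > 0 runs once X is bound
% ?- freeze(X, X > 0), X = -1.   % fails: the woken goal -1 > 0 fails
% ?- freeze(X, X > 0).           % succeeds, with the goal still frozen
```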
@hwayne have you tried other languages in that space, like Mercury, LambdaProlog, Curry, that sort of thing? I'd be curious to hear your thoughts besides Picat!
Mercury is on my list, but right now my main objective is to understand Clingo and answer set programming.
IMO the "value per dollar" of learning ASP is way higher than Mercury or Curry, possibly even Prolog. My favorite thing about ASP is that it is incredibly good at expressing exceptional situations. For instance, I made a secret santa program for my relatives this last year, and using ASP made it basically trivial to encode rules like "nobody should get their spouse or child" and "don't pair Susan with Doug." You can write this program trivially in many languages but keeping the algorithm clear while handling these exceptions is not easy.
I strongly recommend the book "Answer Set Programming," I could not have learned it without it.
I always imagine Prolog is a good backend for computer science metanotation, and then I read posts such as these and it scares me off...
One thing that surprised me when I was learning Prolog is that the "database" is basically one big global namespace. I think serious implementations like SWI bolt on some sort of module system, but they feel a bit like an afterthought.
Yes, Prolog is different. I can recommend O'Keefe's "The Craft of Prolog" for a more pragmatic and more hands-on approach. It takes some time to bend your mind into how it all works (like with all interesting programming languages) and I personally struggled a lot until I saw the light, but in the end it's worth the effort. Perhaps surprisingly, Prolog code is often more compact than the equivalent in a procedural or functional language, once you learn to utilize the resolution mechanism in expressing your control flow.
I find the mental model of logic programming fascinating, but I don't think Prolog is the best manifestation of that model. As mentioned in the post, Picat code is even more compact than Prolog, because it has native maps, strings, and functions.
Without doubt, features that appeal because they are familiar can make the transition easier. Yet I find that in my own Prolog coding I don't have much need for them. For example, I like strings as lists of character codes, removing a basically redundant type (you can already do all you need with list processing and unification), which I find much more elegant (performance issues aside). In my experience, once I embraced the full Prolog way of doing things (char lists, the well-placed cut here and there, and, yes, even assert/retract), I never felt much need for these "familiar" features.