Notes

Notes, etc. of Samuel Flint

Programming Language Concepts

Lecture 1

Administrivia

  • Uses hand-in for assignments

  • Canvas is used for announcements

  • four languages/techniques

    • antlr

    • javascript

    • haskell

    • prolog

  • Class uses iclicker channel AC

Lecture 2

Intro

  • not about popular languages

  • why not just java

    • no low-level stuff

  • Electronic submission of assignments

    • PDFs with highlighting and annotations

    • using webgrader

  • Academic honesty

    • fairly normal

    • discuss above the level of pseudocode

    • permitted diagrams can be found in 15.1, 15.2, 16.1 of the textbook

  • clicker questions

    • discuss problems and different views of problems

  • Ex, Matlab

    • used in engineering

    • interpreted, proprietary, GNU Octave

    • input and source program go in at the same time

    • use the strengths of languages, in matlab, matrices are your friend

The start

  • lexical/syntax analysis

    • translate source code into machine code or other source code

      • rules about what to expect

    • char stream → scanner → token stream → parser → parse tree → semantic analysis → AST or IR → optimization → IR → final output

    • syntax rules provide formatting, allows both humans and computers to process and create information

  • scanners recognize regular languages, break input into tokens, like regular expressions

  • parsers recognize context-free grammars

Lecture 3

More scanners

  • regexes for regular languages

  • NFA can go to different states at varying times

    • $\epsilon$ transitions consume no input

    • states can be shrunk by finding removable $\epsilon$ transitions, and thus converted to a DFA

  • DFA can be simplified, but must be done carefully

  • a simplified DFA can be converted to a lookup table
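
A sketch of the lookup-table idea (invented example, not from lecture): a DFA that accepts identifiers of the form [a-z][a-z0-9]*, with the transition function stored as a table keyed on (state, character).

```python
# Sketch: a DFA encoded as a lookup table. States and transitions are
# invented for illustration: accept identifiers [a-z][a-z0-9]*.
def make_table():
    table = {}
    for c in "abcdefghijklmnopqrstuvwxyz":
        table[("start", c)] = "ident"
        table[("ident", c)] = "ident"
    for d in "0123456789":
        table[("ident", d)] = "ident"
    return table

def accepts(s, table, start="start", accepting=("ident",)):
    state = start
    for c in s:
        state = table.get((state, c))  # missing entry = dead state
        if state is None:
            return False
    return state in accepting

TABLE = make_table()
```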

Context Free Grammars

  • allow you to recognize paired parens, keep track of information and state

  • grammars are terminals and nonterminals

  • lhs has only non-terminals

  • terminals and non-terminals on rhs

  • BNF most common notation for CFGs

    • no kleene star

  • EBNF allows for kleene start or plus

  • start with a start symbol (production)

  • continue using valid productions based on tokens

  • end when all non-terminals are resolved

  • LL – left-to-right scan, leftmost derivation (noting you needed to make a specific turn), top down, requires look-ahead, breaks ties on the left

  • LR – left-to-right scan, rightmost derivation (noting what turns you need to make next), bottom up, breaks ties on the right

  • will form parse trees

Lecture 4

More Scanning and Parsing

  • LL Grammars, Deriving from the left

    • top down

    • predicts production to use

    • the number of lookahead tokens required determines the grammar's classification

  • LR grammars, Right Most derivation

    • bottom-up

    • recognize which production to use
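
A minimal sketch of top-down (LL) prediction with one token of lookahead, for an invented toy grammar expr → term ('+' term)*:

```python
# Sketch of a predictive recursive-descent parser; the grammar and token
# representation are invented for illustration.
def parse_expr(tokens):
    pos = parse_term(tokens, 0)
    while pos < len(tokens) and tokens[pos] == "+":  # one-token lookahead
        pos = parse_term(tokens, pos + 1)
    return pos

def parse_term(tokens, pos):
    if pos < len(tokens) and tokens[pos].isdigit():
        return pos + 1
    raise SyntaxError("expected a number at position %d" % pos)

def is_valid(tokens):
    try:
        return parse_expr(tokens) == len(tokens)
    except SyntaxError:
        return False
```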

Ambiguity

  • fixing ambiguity can be done by adding new rules, or modifying other ones

  • antlr syntax isn't that bad

Lecture 5

Semantic Analysis

  • syntax does not equal semantics

  • syntax is form of a valid program

  • semantics are the meaning of a valid program

  • syntax is recognized by a CFG

  • semantics are recognized by an attribute grammar

  • semantics make sure input is meaningful

  • static semantics are enforced at compile time

  • dynamic semantics are checked at run time

  • parse tree can be decorated (annotated) with attributes

  • some dynamic checks can be slow, some free

    • div by zero can be done in hardware

    • pointers can be more complicated

  • use of assertions, pre/post conditions and invariants

  • static analysis

    • compile time prediction of behavior

    • precise analysis can determine if a program will always follow rules

      • Haskell's Hindley-Milner type checker

    • generates code for what is unknown at compile time

    • determine what is safe to do (thread safety for instance)

  • attribute grammars

    • add meaning to a parse tree

    • can only use local information

    • annotation is the process of evaluating attributes

    • flow is bottom up

    • all S-attributed grammars are L-attributed grammars
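
The bottom-up (synthesized-attribute) flow can be sketched like this; the tree representation is invented for illustration:

```python
# Sketch: synthesized attributes flow bottom-up. Each node's value is
# computed purely from its children's attributes (local information only).
# A node is either an int leaf or a tuple ("+", left, right) / ("*", left, right).
def decorate(node):
    if isinstance(node, int):                    # leaf: attribute from the token
        return node
    op, left, right = node
    lv, rv = decorate(left), decorate(right)     # children first (bottom-up)
    return lv + rv if op == "+" else lv * rv
```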

Lecture 6: Semantic Analysis, Continued

  • review semantics of stuff, review types of attribute grammars

  • oblivious method – doesn't use any special knowledge

  • dynamic, use topological sort followed by rule invocation

  • antlr is to cfgs what other tools are to attribute grammars

  • most good compilers use hand-written ad-hoc translation schemes

  • translation does not require full parse tree

Lecture 7: Decorating A Syntax Tree and Control Flow

tree grammars

  • represent the possible structure of syntax trees

  • a production is a possible relationship between a parent and its children

  • 'A : B' is read as A is a variant of B

  • CFG versus tree grammar

    • results

      • strings of tokens

      • trees

  • root can have all static semantic problems

  • often perform type checking

  • errors flow into child nodes and back to root

Control Flow

  • does a reference store location or value

    • l-values denote locations

    • r-values denote values

  • value model, most things are l-values

    • lets you do pointers to pointers to pointers to…

  • reference models connect boxes to other things

    • lets you have every variable as an l-value; dereference to get an r-value

  • java has value model for built-ins, reference model for user-defined types

  • Boxing is used for wrapping values into a class, could be implicit or explicit

  • assignment may use different symbols for assignment and comparison

  • initialization

    • static variables

      • consideration at compile time

    • aggregates

      • structured values, user-defined types

    • deallocation

    • dynamic semantic errors

      • ex divide by zero at runtime

    • constructors

      • there to initialize space, may not necessarily create that space

  • Ordering

    • order of operand eval

    • ordering may have side effects

    • ordering is usually undefined

Lecture 8: Antlr and Exceptions

Assignment 1

  • use antlr to decode maze data

  • has some specific rules

  • antlr generates lexers, parsers, listeners, tokenizers from user-defined grammars

Antlr and Exceptions

  • antlr is Java-based

  • scan file for valid tokens

  • parse file for structure/semantics

  • using combined grammars

    • which requires a start rule

  • uses listeners when nodes are visited

  • our listener will just handle errors

  • no issue with various ide, but it must run on webgrader

  • textbook covers exceptions in both control flow and subroutine contexts

  • generate a solution to a problem

  • return a status to the calling routine

  • expect the calling routine to pass error handling code

  • use exceptions

    • try – enclose code where errors may occur

    • catch – expect specific types of errors and handle them

    • throw – signal a specific type of error

    • finally – clean stuff up

  • try/catch matching goes until a handler is found or all are exhausted, top to bottom, inside out

  • all antlr rules are in try/catch/finally blocks

  • catch, report, recover, return
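
The try/catch/finally flow above in Python form (names invented for illustration):

```python
# Sketch of the try / catch (except) / finally flow.
log = []

def divide(a, b):
    try:                          # enclose code where errors may occur
        return a / b
    except ZeroDivisionError:     # catch a specific type of error
        log.append("caught")
        return None
    finally:                      # clean stuff up, caught or not
        log.append("cleanup")
```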

Lecture 9: Control Flow

  • preproc questions to make you think about stuff

Control Flow

  • r-values versus l values

  • evaluation is often determined by locality

  • or by mathematical identities/properties

  • c sharp has checked and unchecked qualifiers, which denote whether or not an exception is raised on overflow

  • short circuit evaluation – don't run a function if a previous condition in an and is false, or if a previous condition is true for an or

  • structured and unstructured flow

    • GOTO is unstructured

    • RETURN, Break, Continue and Exception are considered structured

    • return-from and throw/catch

    • multi-level return, with no post processing required

    • exceptions require post processing

    • continuations reset context

  • sequencing

    • compound statements and side effects
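
Short-circuit evaluation can be observed directly (sketch; helper names invented): the second operand of `and`/`or` is never evaluated when the first already decides the result.

```python
# Sketch: record which operands actually get evaluated.
calls = []

def check(name, value):
    calls.append(name)
    return value

_ = check("a", False) and check("b", True)   # "b" is never called
_ = check("c", True) or check("d", True)     # "d" is never called
```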

Lecture 10: Names Scopes and Bindings

Scoping

static

  • bindings are determined at compile time

  • matching declarations whose block most closely surrounds a given point

  • generally looks from innermost to outermost

  • declaration order varies by language

  • stuff must be declared before use, but also must be defined before use

  • information hiding is good, but static variables can only abstract one routine

  • java's packages and cpp's namespace

dynamic

  • determined at run time, think perl, python – kinda, \TeX

  • be careful of Scoping rules, they can be weird

Lecture 11: Names, Scoping and Binding, continued

dynamic scoping

  • cons – run-time checking, run-time errors

  • pros – run-time customization

  • has a stack of bindings

  • uses last-in-scope assignment

  • eliminates passing some variables

  • think latex references
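
A sketch of the stack-of-bindings idea behind dynamic scoping (names and structure invented for illustration):

```python
# Sketch: one stack of bindings per name; lookup finds the most recent
# (last-in-scope) binding, as in a dynamically scoped language.
bindings = {}

def enter(name, value):
    bindings.setdefault(name, []).append(value)

def leave(name):
    bindings[name].pop()

def lookup(name):
    return bindings[name][-1]     # last-in-scope assignment wins

enter("x", 1)        # caller binds x
enter("x", 2)        # callee dynamically rebinds x
inner = lookup("x")
leave("x")           # callee returns; its binding disappears
outer = lookup("x")
```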

meaning within scope

  • assumed a one-to-one mapping between names and objects/functions

  • what if that isn't the case

  • and this is where we get name mangling for overloaded functions

  • coercion lets the compiler convert types

  • polymorphism is where a subroutine accepts several different unconverted types (templates)

  • can happen at compile time or at run time

  • overloading is explicit, polymorphism is implicit

Binding rules

  • referencing to subroutines

  • bind when created or called

  • shallow binding is when called, often default

Lecture 12: Scoping and Binding, continued

Binding rules of referencing environments

  • references to subroutines

  • bind when created or called

  • shallow binding is when called, and is default for many languages

  • closures are a pointer to a subroutine and the surrounding environment

  • first class means it can be passed as a parameter, returned from a function or assigned to

  • second-class values can only be passed as parameters (not returned or assigned)

  • third-class values may not have anything done to them (may not be passed as a param, returned, or assigned to)

  • limited extent, local object, local scope (stack)

  • unlimited extent, local object, indefinite scope (heap)

  • in OO languages, closures can be done with object-based encapsulation, or to some extent interfaces; often called an object closure or a function object

  • macros – woot!!

  • compile-time expansion

  • text substitutions

  • pros – no subroutine calls

  • cons – unexpected behavior
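
A closure sketch (a subroutine plus the referencing environment it was created in; names invented):

```python
# Sketch: the returned function carries its surrounding environment with it.
def make_counter():
    count = 0
    def increment():
        nonlocal count            # captured from the enclosing scope
        count += 1
        return count
    return increment              # first-class: returned from a function

c = make_counter()
```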

Lecture 13: Data Types

Intro

  • common considerations: ops, constructors, return types

  • limit valid operations

  • give meaning to the binary representations

    • which could be locations

    • or instructions

    • or data

Systems

  • assoc type with language constructs

  • rules for type equivalence, compatibility and inference

  • what can be done with functions

  • checking

    type clash – violation of type checking rules

    strongly typed – prohibits operations on objects of the wrong type

    statically typed – strongly typed at compile time, ex Haskell

    dynamically typed – runtime checking, often expensive

    • see JavaScript, which is also weakly typed

    • or Matlab

  • meaning of types

    denotational – type is a set of values, a domain

    constructive – type is a primitive, or a collection of data (arrays, sets, etc)

    abstraction-based – interface of possible operations

  • classifications

    numerics – precision differences, scalars

    enumeration – numerically valid shorthand

    • allow only discrete values

    • numerically valid

    • Pascal only allows an enumerated type to be compared with itself

    subrange – compact representation of a limited range

    • can be used like documentation – seems hokey

    • can allow the compiler to check values

    • can have a compact representation

    composite – arrays, sets, tables, etc

    records – collections of fields

    arrays – indexed types

    sets – distinct elements

    pointers – references (l-values)

    lists – arrays without indexing, often singly linked

    files – a way to store any information

  • orthogonality – non-numeric indices, etc.

Type Checking

  • make sure that operations are valid

  • checking compatibility versus equivalence – can they be used in the same way, or are they exactly the same

  • structural equivalence – checking equivalent content recursively

  • name equivalence – checking named definitions

  • Equivalence

    • school vs student

      • maybe by name, address, age

      • or by membership

    • aliases – nicknames or distinctly different objects?

    • can require runtime checking for correspondence

  • compatibility

    • overloading and coercion

    • universal reference types

    • templates

  • inference

    • operation on two subrange values

      • ex average of grades

    • result is of the subrange's base type
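
Structural equivalence can be sketched as a recursive comparison (the type representation here is invented: a primitive is a string, a record is a tuple of field types):

```python
# Sketch: structural equivalence compares the recursive shape of two types;
# name equivalence would compare only the declared names.
def structurally_equal(t1, t2):
    if isinstance(t1, str) or isinstance(t2, str):
        return t1 == t2                      # primitives: same name
    return len(t1) == len(t2) and all(
        structurally_equal(a, b) for a, b in zip(t1, t2))

school  = ("str", "str", "int")   # name, address, age
student = ("str", "str", "int")   # same shape, different declared name
```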

Lecture 14: Data Types, Continued

Arrays

  • nice way of managing collections of things

  • memory layout

    • first element at address

    • where is second

    • and what about multi-dimensional

      • can be stored in row major or column major orders

      • row is addressed first in row major

        • most languages

      • column first in column major

        • fortran, matlab

    • can also be done with row pointers

    • several elements are often brought into the cache at one time

  • address calculation

    • $S_3 = b$

    • $S_2 = S_3 (U_3 - L_3 + 1)$

    • $S_1 = S_2 (U_2 - L_2 + 1)$

    • $P = a + S_1 (i - L_1) + S_2 (j - L_2) + S_3 (k - L_3) = a - (L_1 S_1 + L_2 S_2 + L_3 S_3) + S_1 i + S_2 j + S_3 k$
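
The address calculation above, checked numerically (bounds, element size, and base address are invented values):

```python
# Sketch: row-major address formula for a 3-D array with bounds L_i..U_i,
# element size b, base address a.
b = 4
L1, U1 = 1, 3
L2, U2 = 0, 4
L3, U3 = 2, 6
a = 1000

S3 = b                      # stride of the last dimension = element size
S2 = S3 * (U3 - L3 + 1)
S1 = S2 * (U2 - L2 + 1)

def address(i, j, k):
    return a + S1 * (i - L1) + S2 * (j - L2) + S3 * (k - L3)
```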

Strings

  • Special Case of arrays

  • have escape sequences

  • pointers to string literals or arrays of characters

Sets

  • implemented as arrays, hash tables, trees

  • or characteristic arrays

    • bit i corresponds to value i being in set

    • union is bitwise-or

    • intersection is bitwise-and

    • difference is just another bitwise operation
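
The characteristic-array idea as a sketch (values invented): a set over small integers is a bit vector, and the set operations become single bitwise operations.

```python
# Sketch: bit i set means value i is in the set.
def to_bits(values):
    bits = 0
    for v in values:
        bits |= 1 << v
    return bits

A = to_bits([1, 2, 3])
B = to_bits([2, 3, 5])
union        = A | B          # bitwise or
intersection = A & B          # bitwise and
difference   = A & ~B         # another bitwise operation
```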

Pointers and Recursive Types

  • often records, Employee and Manager

  • or linked data structures

  • reference model is easy

  • value model involves pointers and memory management voodoo

  • functional languages use a reference model

  • in imperative languages there's often a hybrid, Java being kinda weird

  • definition of a tree is recursive

  • functional structures are acyclic, but beware of mutually recursive structures

  • value model requires pointers

  • explicit definition of each element

  • pointer dereferencing

  • pointers and arrays in c

  • C and Algol 68 do allow references to stack objects

  • can't reset all references after reclamation

  • Algol prevents references to objects with shorter lifetimes

Garbage collection

  • ref count model (increment the count when a pointer refers to a memory address, decrement when a pointer no longer refers to it, reclaim the space when the reference count is 0; has problems with circular structures)
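
A sketch of why reference counting leaks circular structures (classes and names invented for illustration):

```python
# Sketch: each object tracks how many pointers refer to it; space is
# reclaimable only when the count hits 0.
class Obj:
    def __init__(self):
        self.refcount = 0
        self.ref = None

def point(src, target):
    if src.ref is not None:
        src.ref.refcount -= 1     # old target loses a reference
    src.ref = target
    if target is not None:
        target.refcount += 1      # new target gains a reference

a, b = Obj(), Obj()
point(a, b)                       # a -> b
point(b, a)                       # b -> a: a cycle
# Even with no outside pointers to a or b, both counts stay at 1,
# so a pure reference counter never reclaims this cycle.
```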

Lecture 15: Data Types, Continued

Recursive Types, Continued

  • how is stuff allocated? Stack or Heap

  • be careful with using reference counting

  • gc can be done with tracing

    • location reachable from current pointer

    • or mark and sweep – assumes everything is garbage unless marked reachable

    • blah blah blah

Lists

  • can be empty, or an object followed by another list

  • are a staple of functional languages

  • lisp programs are lists

  • often car and cdr

Lecture 15: Object Orientation

  • modules

  • elements of object orientation

  • inheritance as dynamic method binding

  • encapsulation

  • abstraction as reduced conceptual load, fault containment and independence

  • code reuse through extension

  • public members are exported

  • scope resolution with :: and namespace

  • public members allows access to representation

  • overloading makes representation appear public

  • queue inherits from list

    • base class constructor called first

    • needs

      • enqueue – add to end of list

      • dequeue – remove head of list

      • peek – get head of list

      • size – get size of list

    • choice: redefine base methods or use carefully

      • pros: gets exactly what wanted how wanted

      • cons: code reuse and hiding implementation details

Lecture 16: Object Orientation

Constructors

  • class decides visibility of members

  • derived classes can't make something more visible

  • various languages provide various visibility policies

  • some languages have inner classes, cpp has static members

  • initialization requires that a constructor finalizes space, based on the most specific class

  • calling a no-argument constructor that doesn't exist is a static semantic error

  • copy constructors are used to convert from one class to another

  • assignments are made after initialization

Dynamic Method Binding

  • subtype, derived class without hiding

  • subtype polymorphism

  • formal versus actual parameters

  • static method binding uses the method of the class as declared (the compile-time type)

  • dynamic method binding uses the method of the class as found (the run-time type)

  • static binding prevents a derived class from controlling its own behavior

  • interfaces are used for virtual methods and abstract methods and classes

  • interfaces are also classes with only abstract methods

  • virtual method tables track an object's class

  • a derived class copies table entries from the base class and adds to them

  • polymorphism

    • do we need generics if we have dmb and polymorphism?

    • you have to make classes distinct, requiring extra code

    • generics can abstract over unrelated types

  • some languages allow for multiple inheritance, some don't and some have weird issues

  • some languages are closer to pure object oriented languages (Ruby, Eiffel), others are not (cpp close to c)
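
Dynamic method binding in a sketch (classes invented; Python always dispatches on the run-time class):

```python
# Sketch: the method run depends on the object's class as found at run time,
# not on the type the caller had in mind.
class ListLike:
    def describe(self):
        return "list"

class Queue(ListLike):            # queue inherits from list
    def describe(self):           # overrides the base method
        return "queue"

def name_of(obj):                 # caller thinks of obj as "a ListLike"
    return obj.describe()         # dispatched on the actual class
```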

Lecture 17: Functional Language

  • Lisp/Scheme vs Haskell

  • output is a function of input

  • side effects or internal state are not kept

  • lists of inputs and list operators

  • first-class functions

  • higher-order functions

  • defines what something is

  • Turing, Church, Kleene and Post

  • Turing Machine gave imperative

  • Kleene and Post gave Abstract

  • Church gave the Lambda Calculus, which gave functional programming

    • see Church Turing Thesis

  • Constructive Vs. Non-Constructive Proof

  • Lisp used for symbolic data

  • many functions, and great with recursion, lots of temporary variables and gc

  • often have defined evaluation order and higher-order functions

Lisp and Scheme

  • scheme is-a lisp

  • programs and data stored as lists

  • can make it possible to interpret lisp programs in lisp

  • allows for self-modifying code

  • read-eval-print interaction

  • bindings

  • lists

  • various equality functions

Haskell

  • has some very good resources

  • does a lot with filtering and pattern matching

  • and recursion

  quicksort :: Ord a => [a] -> [a]
  quicksort [] = []
  quicksort (p:rest) = (quicksort leqs) ++ [p] ++ (quicksort gts)
    where leqs = [x | x <- rest, x <= p]
          gts = [y | y<-rest, y > p]

Eval Order Revisted

  • applicative order – eval function arguments before passing

  • normal-order – eval function arguments after passing

  • expression types – functions and special forms

  • functions – pass by sharing and application-order

  • special forms – pass by name and normal-order

  • param passing

    • call by value

    • call by reference

      • most langs require an l-value

      • fortran however creates a temp variable

    • call by value/result – changes are reflected after return

    • call by sharing – actual/formal parameters refer to same object

      • user-defined values in java

Lecture 18: Functional Languages, Continued

Application Order, continued

  • applicative order – eval before passing

  • normal order – eval after passing or as required by the form

  • functions and special forms

  • functions are pass by sharing and applicative-order eval

  • special forms are pass by name and normal order eval

  • Const in cpp, aliases in cpp

  • using stuff and not allowing it to be changed without copying, use const to prevent change

Strict and Lazy Eval

  • strict – undefined behavior for undefined arguments, eval order not important

  • Non-strict – Eh, who cares

  • strict languages only have strict functions; applicative order implies strict

  • scheme is strict, haskell is not

  • use memoization for eval, not evaluating the same thing twice

  • haskell uses lazy eval

  • in Scheme and Lisp it is optional

  • haskell has no side effects

  • in scheme and lisps, the programmer manages side effects

I/O

  • A stream is infinite data

  • Monads are used to pass current state to a function and get the next state back

  • I/O Monads are partially hidden
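
The infinite-stream idea sketched with Python generators standing in for Haskell's lazy lists: only the demanded prefix is ever computed.

```python
# Sketch: an infinite stream evaluated lazily, on demand.
from itertools import islice

def naturals():
    n = 0
    while True:        # infinite: values are produced only when requested
        yield n
        n += 1

first_five = list(islice(naturals(), 5))
```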

Higher-order

  • mapping and currying

  • currying allows for partial application
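
Currying and partial application as a sketch (function names invented):

```python
# Sketch: currying turns a two-argument function into nested one-argument
# functions, which enables partial application.
def curry_add(x):
    def add_x(y):
        return x + y
    return add_x

add3 = curry_add(3)                 # partial application
doubled = list(map(add3, [1, 2]))   # mapping a partially applied function
```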

Lecture 19: Haskell and Functional Programing

  • Just learn the language and learn to think recursively

  import Data.List

  dropn :: (Integral a) => a -> [t] -> [t]
  dropn n [] = []
  dropn n (x:xs)
    | (n == 1) = xs
    | otherwise = dropn (n - 1) xs

  myMap :: (a -> b) -> [a] -> [b]
  myMap func [] = []
  myMap func (x:xs) = (func x):myMap func xs

  member :: Eq a  => a -> [a] -> Bool
  member x [] = False
  member match (x:xs)
    | (match == x) = True
    | otherwise = member match xs

  evenSum :: Integral a => [a] -> a
  evenSum [] = 0
  evenSum (x:xs)
    | (isEven x) = x + evenSum xs
    | otherwise = evenSum xs
    where isEven a = (a `mod` 2) == 0

  prefix :: [a] -> [a]
  prefix list = take (length list - 1) list
  find :: Eq a => a -> [a] -> [Int]
  find x list = find' x list 0 
    where find' _ [] _ = []
          find' x (h:t) n
            | (h == x) = (n+1):find' x t (n+1)
            | otherwise = find' x t (n+1)

Lecture 20: Haskell, Continued

  -- Mine
  fibonacci :: (Integral t1, Integral t) => t1 -> t
  fibonacci 0 = 0
  fibonacci n = fibonacci' n 0 1
    where fibonacci' 0 a _ = a
          fibonacci' 1 _ b = b
          fibonacci' n a b = fibonacci' (n - 1) b (a + b)

  -- RyPat's
  fibTuple :: (Integer, Integer, Integer) -> (Integer, Integer, Integer)
  fibTuple (a, b, 0) = (a, b, 0)
  fibTuple (x, y, n) = (fxp1, (fx + fxp1), n)
    where (fx, fxp1, _) = fibTuple (x, y, (n - 1))

  fibTupleBasedFib :: Integer -> Integer
  fibTupleBasedFib n = l
    where (l, _, _) = fibTuple (0, 1, n)
  import Data.List

  stopAtVowel [] = []
  stopAtVowel (x:xs)
    | (elem x "aeiouAEIOU") = (x:xs)
    | otherwise = stopAtVowel xs

Lecture 21: Functional Programming and Haskell, Continued

Types

  • built in types

  • custom types with data

  • ex data Shape = Circle Float Float Float | Rectangle Float Float Float Float

    surface :: Shape -> Float

  • Type synonyms are a thing, giving names to compound types

Doing imperative stuff in functional languages

  • loop by recursion

  • or use stuff like fold/map

  ourPower :: (Integral a, Num b) => a -> b -> b
  ourPower n = (^ n)
  import Data.List

  findInMatrix :: (Eq a, Integral b) => a -> [[a]] -> [(b, b)]
  findInMatrix _ [] = []
  findInMatrix x list = findInMatrix' x list 0
    where findInMatrix' _ [] _ = []
          findInMatrix' a (h:t) n
            | (elem a h) = [(n, p) | p <- getPositions a h 0] ++ findInMatrix' a t (n + 1)
            | otherwise = findInMatrix' a t (n + 1)
              where getPositions _ [] _ = []
                    getPositions x (h:t) n
                      | (x == h) = n : getPositions x t (n + 1)
                      | otherwise = getPositions x t (n + 1)

Lecture 22: Logic Languages

  • only in the first-order

  • can have some limits

  • programmer sets goal, program attempts to imply goal

  • functional is what something is, logic is what's true about something

  • no input/output values, well, kinda

  myGCD(A, B, G) :- A = B, G = A.
  myGCD(A, B, G) :- A > B, C is A - B, myGCD(C, B, G).
  myGCD(A, B, G) :- B > A, C is B - A, myGCD(C, A, G).

  • horn clauses have a head and a body, read $H$ if $B_1, \ldots B_n$, officially $H \gets B_1, \ldots B_n$

  • resolution is cancelling of terms

  • unification is the process by which this is done

  • interpreter runs in a database of clauses

  • terms are constants, variables or structures

  • constants are atoms or numbers

  • structures are logical predicates or data structures

  • atoms are all lowercase

  • variables are uppercase and they're limited to the scope of the clause

  • type checking is very dynamic

  • functors and lists of arguments

  • clauses end with a period

  • symbols frequently used

    :-

    implication

    ,

    and

    ?-

    query

    ;

    next solution

  • variables in head of the horn clause are universally quantified

  • variables not in head are existentially quantified

  • an empty left side is the query or goal

  • order matters

  • the faster you can say no, the faster you can go

  • if $C_1$ and $C_2$ are clauses and the head of $C_1$ matches a term in $C_2$, the body of $C_1$ can replace that term in the body of $C_2$

  • unification is pattern matching

  • instantiated variables are given values after unification

    • constants unify with themselves

    • structures unify iff the functors match and the arities and args unify

    • vars unify with anything

  • equality is unifiability

  • lists are an element and a list which may be an empty list

  • arithmetic is a predicate, not a function

  • use is for arithmetic evaluation (=:= for arithmetic comparison)

  • forward chaining is from start to finish

  • backward chaining is from finish to start

    • with more rules than facts, forward chaining is more efficient

  • prolog uses backward chaining

  • terms on the right unified with heads of other clauses

  • depth-first, left-to-right

  • if the search can't continue

    • remove the last step and try an alternative

  • implemented using a single stack

  • an achieved subgoal is kept on the stack in case of backtracking

  • consider first to last
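
Unification can be sketched outside Prolog (term representation invented: a structure is a tuple ("functor", arg1, …), variables are uppercase strings, everything else is a constant):

```python
# Sketch: constants unify with themselves, variables unify with anything,
# structures unify iff functor and arity match and the arguments unify.
def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def unify(t1, t2, subst):
    t1, t2 = subst.get(t1, t1), subst.get(t2, t2)  # follow earlier bindings
    if t1 == t2:
        return subst
    if is_var(t1):
        return {**subst, t1: t2}                   # instantiate the variable
    if is_var(t2):
        return {**subst, t2: t1}
    if isinstance(t1, tuple) and isinstance(t2, tuple) and len(t1) == len(t2):
        for a, b in zip(t1, t2):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None                                    # functor/constant clash

s = unify(("edge", "X", "b"), ("edge", "a", "Y"), {})
```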

Lecture 23: Midterm Review

  • shallow binding – finds the most recent declaration in scope

  • deep binding – create the binding when the args are passed (ex, when a function argument is passed, it knows about the scope at that time)
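
Deep binding sketched with closures (Python closures capture the creation-time environment; names invented):

```python
# Sketch: under deep binding, a subroutine passed as an argument remembers
# the environment where it was created, not where it is called.
def run(f):
    x = "call-time"        # this x would be found under shallow binding
    return f()

def caller():
    x = "creation-time"
    def show():
        return x           # refers to caller's x (deep binding behavior)
    return run(show)

result = caller()
```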

Lecture 24: Logic Languages, Continued

  • be careful with how you structure horn clauses

  • cut (the !) operator will cut off multiple evaluation

  • loops are done with fail conditions, but use recursion, it's easier

  • consult and reconsult get predicates and data from a file

  • get will get the text printable character

  • assert and retract will add and remove clauses to programs

      dropn([], _, []).
      dropn(Before, 0, Before).
      dropn([H|T], N, After) :- length([H|T],L), between(1,L,N), K is N - 1, dropn(T, K, After).

Lecture 25: Logic Languages, Continued Again

  prefix([_], []).
  prefix([H|T], [H|PrefT]) :- prefix(T, PrefT).
  evenSum([], 0).
  evenSum([H|T], X) :- 0 is mod(H, 2), evenSum(T, SUM), X is SUM + H.
  evenSum([H|T], X) :- 1 is mod(H, 2), evenSum(T, X).
  member(Element, [Element|_]):-!.
  member(Element, [_|T]) :- member(Element, T).
  get([H|_], H, 1).
  get([_|T], E, X) :- get(T, E, N), X = 1 + N.
  fib(0, 0).
  fib(1, 1).
  fib(N, X) :- N > 1, NM1 is N - 1, fib(NM1, FNM1), NM2 is N - 2, fib(NM2, FNM2), X is FNM1 + FNM2.

Lecture 26: Logic Languages, Yet Again

  fibTuple((0, 1, 0)).
  fibTuple((FibN, FibNP1, N)) :-
      length(_, N),
      N > 0,
      NM1 is N - 1,
      fibTuple((FibNM1, FibN, NM1)),
      FibNP1 is FibNM1 + FibN.

  fib(N, F) :- fibTuple((F, _, N)).

Lecture 27: Still more Logic Languages: Searching techniques

  path(X,X).
  path(X,Y):-
      edge(Z,Y),
      path(X,Z).

  edge(a,b).
  edge(b,c).
  edge(b,d).
  edge(b,e).
  edge(c,f).
  edge(e,b).
  edge(e,f).
  edge(e,g).
  edge(f,c).
  edge(f,h).
  edge(g,h).
  edge(g,j).
  edge(h,k).
  edge(i,g).
  edge(j,i).
  edge(k,l).
  edge(l,j).

  dfs_smart(From, From, _, [From]).
  dfs_smart(From, To, Visited,[From|Rest]) :-
      edge(From, Next),
      \+ member(Next, Visited),
      dfs_smart(Next, To, [Next|Visited], Rest).

  bfs(From, From, [From]).
  bfs(From, To, [From|Rest]) :-
      length(Rest, _),
      edge(From, Next),
      bfs(Next, To, Rest).

  bfs_smart(From, From, _, [From]).
  bfs_smart(From, To, Visited, [From|Rest]) :-
      length(Rest, _),
      edge(From, Next),
      \+ member(Next, Visited),
      bfs_smart(Next, To, [Next|Visited], Rest).
  change([HB|TB], 0, NewWhat, [NewWhat|TB]).
  change([HB|TB], N, NewWhat, [HB|TA]) :-
      NM1 is N - 1,
      change(TB, NM1, NewWhat, TA).

Lecture 28: Yet more logic languages

  • still more searches

  • remember findall – will try to find everything

  • bagof will look for all solutions, grouped by free variables unless they're existentially qualified

  • setof will remove duplicates. period. and sorts alphabetically

  • remember – = is literal unification, is evaluates arithmetic

Lecture 29: Scripting Languages

  • used for glue

  • shell

  • report generation

  • multipurpose

  • web

  • automate mundane tasks

  • possible in java, c, cpp, others, but can be difficult

  • scripting languages are often useful for fast development of quick stuff – have less overhead

  • batch v interactive

  • simplicity

  • scoping can be weird

  • data types are very fluid

  • system access is incredible

  • often has pattern matching

  • shell

  • text processing/report generation

  • math and statistics

  • extensions

  • forms

Lecture 30: Concurrency

  • thinking about concurrency

    • can be used for some logical structures

    • available resources

    • multiple devices

  • concurrency – two or more tasks at the same time (in unpredictable state)

  • parallel – more than one task physically active

  • distributed – physically separated processors

  • single processor

  • gpu

  • multicore

  • simd – single instruction, multi data

    • vector processing

    • unwinding loops

  • mimd – multi instruction, multi data

    • multiprocessors

  • can be done for speed, because of the formulation of solutions or distribution

  • issues with race conditions, can be fixed with mutexes

  • threads

  • dispatch loops can be used for a predictable sequence of execution

  • message passing – system bus for multiprocessor architectures, involves shared memory with structured calls

  • communication, synchronization are important

  • languages, libraries and standards change things

Lecture 31: Concurrency, continued

  • heavyweight and lightweight processes

  • co-begin – start several processes at the same time

  • threads are implemented with either schedulers (which handle all save/restore) or cooperative multithreading

  • atomic ops are small enough that it just happens

  • conditional sync waits for preconds

  • too much sync is not enough parallelism

  • busy-wait sync – spin-lock, needs constant time/space, time proportional to n threads is insufficient, back off, is fairly fair

  • be careful of barriers and non-blocking threads

  • monitors are semaphores with management

Lecture 32: Prolog stuff

  • Just a bunch of review

Lecture 33: Continued Prolog

  • remember to be careful of order of predicates – the faster you say no, the faster you go

Lecture 34: More prolog and assignment review, plus some stuff about the final

  • be careful with the searching

Lecture 35: Review for the final - Prolog

  • be careful with tracing and going backward

Lecture 36: Review for the final

  • just be careful with prolog

Lecture 37: More Final Review

  • prolog is going to be horrendous