Flow Matching Examples¶

A study of Flow Matching, based on the "Flow Matching Guide and Code" paper published by Meta.

See also "An Introduction to Flow Matching" by the Cambridge Machine Learning Group.

Imports and Global Variables¶

  • We also redirect some dependencies to locally cloned versions, using go work.
In [1]:
!*rm -f go.work && go work init && go work use . "${HOME}/Projects/gomlx" "${HOME}/Projects/gopjrt" "${HOME}/Projects/gonb"
%goworkfix
	- Added replace rule for module "github.com/gomlx/gomlx" to local directory "/home/janpf/Projects/gomlx".
	- Added replace rule for module "github.com/janpfeifer/gonb" to local directory "/home/janpf/Projects/gonb".
	- Added replace rule for module "github.com/gomlx/gopjrt" to local directory "/home/janpf/Projects/gopjrt".
In [2]:
import (
    "bytes"
    "flag"
    colors "image/color"
    "github.com/gomlx/gomlx/backends"
    _ "github.com/gomlx/gomlx/backends/default"
    fm "github.com/gomlx/gomlx/examples/FlowMatching"
    . "github.com/gomlx/gomlx/pkg/core/graph"
    "github.com/gomlx/gomlx/pkg/ml/context"
    "github.com/janpfeifer/gonb/gonbui"
    "github.com/janpfeifer/must"
    "gonum.org/v1/plot"
    "gonum.org/v1/plot/plotter"
    plotvg "gonum.org/v1/plot/vg"
)

var (
    backend = backends.MustNew()
    _ *Node = nil
)

Plotting A Histogram with gonum.org/v1/plot¶

We define the HistogramXYs function to plot a histogram of a distribution of (X, Y) coordinates.

In [3]:
var Blue = colors.RGBA{0, 0, 0xFF, 0xFF}

// HistogramXYs returns an SVG plot as a string.
// xys must be shaped [numPoints, 2].
func HistogramXYs(title string, xys [][]float32, width, height int) string {
    p := plot.New()
    p.Title.Text = title
    plotter.DefaultLineStyle.Width = plotvg.Points(1)
    plotter.DefaultGlyphStyle.Radius = plotvg.Points(1.5)
    plotter.DefaultGlyphStyle.Color = Blue
    
    pts := make(plotter.XYs, len(xys)+2)
    for ii, xy := range xys {
        pts[ii].X, pts[ii].Y = float64(xy[0]), float64(xy[1])
    }

    // Set the limits of the histogram with 2 fake points.
    ii := len(xys)
    pts[ii].X, pts[ii].Y = -3, -3
    ii++
    pts[ii].X, pts[ii].Y = 3, 3
    scatter := must.M1(plotter.NewScatter(pts))
    p.Add(scatter)
    
    writer := must.M1(p.WriterTo(plotvg.Points(float64(width)), plotvg.Points(float64(height)), "svg"))
    var buf = &bytes.Buffer{}
    must.M1(writer.WriteTo(buf))
    return buf.String()
}

Section 2: Quick tour and key concepts / Code 1¶

This is the GoMLX version of Code 1 in the paper, originally written in PyTorch.

But first, let's start by plotting our source $p_{t=0}(X)$ and target $q(X) = p_{t=1}(X)$ distributions:

  • For these distributions, $X \in \mathbb{R}^2$: each sample is a point $(x, y)$.
  • The source distribution $p_{t=0}$ is a pure 2D Gaussian, with $x$ and $y$ completely uncorrelated.
  • The target distribution $p_{t=1}$ is a "two moons" distribution, defined in fm.MakeMoons.
  • Our goal is to find a model that transforms $p_{t=0}$ into $p_{t=1}$, so that by simply sampling Gaussian points we get the two moons. Getting this right will later allow us to generate images from random noise.
In [4]:
%%
numPoints := 200
ctx := context.New()
normalPoints := context.MustExecOnce(backend, ctx, func (ctx *context.Context, g *Graph) *Node {
        return ctx.RandomNormal(g, shapes.Make(dtypes.F32, numPoints, 2))
    }).Value().([][]float32)
moonsPoints := context.MustExecOnce(backend, ctx, func (ctx *context.Context, g *Graph) *Node {
        return fm.MakeMoons(ctx, g, numPoints)
    }).Value().([][]float32)
gonbui.DisplayHTMLF("<table><tr><td>%s</td><td>%s</td></tr></table>",
    HistogramXYs("Source Distribution: Normal", normalPoints, 200, 200),
    HistogramXYs("Target Distribution: Moons", moonsPoints, 200, 200))
[Two scatter plots: "Source Distribution: Normal" and "Target Distribution: Moons", both with axes from -3 to 3.]

Flow Matching is implicitly attempting to learn a function $\psi(X_0, t)$ that, given an input at $t=0$ and a target time $t$, returns the transformed input. But instead of learning it directly, it learns its time derivative $\frac{d}{dt}\psi(X_0, t)$ for any given $0 \le t \le 1$.

Let's reparametrize $\frac{d}{dt}\psi(X_0, t)$ as the function $u(X_t, t) = u(\psi(X_0, t), t) = \frac{d}{dt}\psi(X_0, t)$. It is important to note that we want $u(X_t, t)$ to be a function of $X_t$ and not of $X_0$, because we will use $X_t$ as the input to our model.

Once we learn $u(X_t, t)$, we can transform any $X_0$ by integrating along $u(X_t, t)$: we split time into slices and take small linear steps in the direction of the slope $u(X_t, t)$.

We define the following symbolic (graph building) Go functions:

  • u(ctx, xyT, t): ctx is the context with the variables of the model; xyT is our input $X_t$ (notice $X$ is an $(x, y)$ pair); and $0 \le t \le 1$.
  • step(ctx, xyT, tStart, tEnd): the step function, which moves the value $X_{t_{start}}$ to $X_{t_{end}}$ by taking one step using the predicted $\frac{d}{dt}\psi(X_0, t)$.
In [5]:
func u(ctx *context.Context, xyT, t *Node) *Node {
    if t.IsScalar() {
        batchSize := xyT.Shape().Dimensions[0]
        t = BroadcastToDims(t, batchSize, 1)
    }
    inputs := Concatenate([]*Node{t, xyT}, -1)
    return fnn.New(ctx.In("u"), inputs, /*num_outputs*/ 2).
        NumHiddenLayers(3, 64).
        Activation(activations.TypeGeluApprox).
        Done()
}

func step(ctx *context.Context, xyT, tStart, tEnd *Node) *Node {
    // For simplicity, using midpoint ODE solver in this example
    slope0 := u(ctx, xyT, tStart)
    ΔT := Sub(tEnd, tStart)
    halfΔT := DivScalar(ΔT, 2)
    midPoint := Add(xyT, Mul(slope0, halfΔT))
    slope1 := u(ctx, midPoint, Add(tStart, halfΔT))
    return Add(xyT, Mul(slope1, ΔT))
}
  • Simple training loop:
In [6]:
var DType = dtypes.Float32

func trainStep(ctx *context.Context, g *Graph, batchSize int, opt optimizers.Interface) {
	xy1 := fm.MakeMoons(ctx, g, batchSize)
	xy0 := ctx.RandomNormal(g, shapes.Make(DType, batchSize, 2))
	t := ctx.RandomUniform(g, shapes.Make(DType, batchSize, 1))  // 0 <= t < 1
	xyT := Add(
		Mul(OneMinus(t), xy0),
		Mul(t, xy1))
	targetSlope := Sub(xy1, xy0)  // Straight line from xy0 to xy1
	predictedSlope := u(ctx, xyT, t)
	loss := losses.MeanSquaredError([]*Node{targetSlope}, []*Node{predictedSlope})
	opt.UpdateGraph(ctx, g, loss)
}

func train(ctx *context.Context, numSteps, batchSize int) {
	opt := optimizers.Adam().LearningRate(0.01).Done()    
    trainStepExec := context.MustNewExec(backend, ctx, func(ctx *context.Context, g *Graph) {
		trainStep(ctx, g, batchSize, opt)
	})
    for range numSteps {
        _ = trainStepExec.MustExec()
    }
}

%%
ctx := context.New().Checked(false)
start := time.Now()
train(ctx, 100, 256)
fmt.Printf("Training 100 steps in %s\n", time.Since(start))
Training 100 steps in 1.673204176s
  • Plotting results:

We train the model that learns the velocity of change that transforms the source distribution (Gaussian) into the target distribution ("moons"). Then we "integrate" the velocity $u(X_t, t)$ using the "midpoint method" (Wikipedia).

In [7]:
%%
ctx := context.New().Checked(false)
numTrainSteps := 10_000
batchSize := 256
start := time.Now()
train(ctx, numTrainSteps, batchSize)
fmt.Printf("Training %d steps (batchSize=%d) in %s\n", numTrainSteps, batchSize, time.Since(start))

numPoints := 100
numPlots := 9
svgPlots := make([]string, 0, numPlots)
// xyT for t=0 are normally distributed points.
xyT := context.MustExecOnce(backend, ctx, func (ctx *context.Context, g *Graph) *Node {
    return ctx.RandomNormal(g, shapes.Make(dtypes.F32, numPoints, 2))
})
stepExec := context.MustNewExec(backend, ctx, step)
for pIdx := range numPlots {
    tEnd := float32(pIdx) / float32(numPlots-1)  // From 0.0 to 1.0
    if pIdx > 0 {
        // If not the initial state, take one step forward from tStart to tEnd
        tStart := float32(pIdx-1) / float32(numPlots-1)
        xyT = stepExec.MustExec1(xyT, tStart, tEnd)
    }
    svgPlot := HistogramXYs(fmt.Sprintf("t=%.2f", tEnd), xyT.Value().([][]float32), 200, 200)
    svgPlots = append(svgPlots, svgPlot)
}

gonbui.DisplayHTMLF("<h3>Distribution shift from t=0 to t=1.0</h3>\n<table><tr><td>\n%s\n</td></tr></table>", strings.Join(svgPlots, "\n</td><td>\n"))
Training 10000 steps (batchSize=256) in 17.32541239s

Distribution shift from t=0 to t=1.0

[Nine scatter plots, panels t=0.00, 0.12, 0.25, 0.38, 0.50, 0.62, 0.75, 0.88, 1.00, each with axes from -3 to 3, showing the points morphing from a Gaussian blob into the two moons.]

A Generative Model for the Oxford Flowers 102 Dataset using Flow Matching¶

Leveraging the code used in the Denoising Diffusion Implicit Models for the Oxford Flowers 102 Dataset demo, we do something similar, but using Flow Matching instead.

The model can be configured with hyperparameters, which can be set with the --set=hyperparam1=value1;hyperparam2=value2;... flag; the model itself is defined in the package fm (github.com/gomlx/gomlx/examples/FlowMatching on go.dev).

There are also several tools to display the progress and results of a trained model in a notebook; we demo them below.

While training can be done in the notebook, it usually takes around 24 hours, so there is also a small command-line tool for training in github.com/gomlx/gomlx/examples/FlowMatching/demo. It comes with good defaults, but go to the demo directory and run go run . --help for a list of options.

Imports, flags and download dataset¶

  • We use diffusion.Config for configuration, since most parameters (e.g.: image and batch size) are the same.
In [8]:
// Clear all state from previous cells
%reset
!*rm -f go.work && go work init && go work use . "${HOME}/Projects/gomlx" "${HOME}/Projects/gopjrt" "${HOME}/Projects/gonb"
%goworkfix
* State reset: all memorized declarations discarded.
	- Added replace rule for module "github.com/janpfeifer/gonb" to local directory "/home/janpf/Projects/gonb".
	- Added replace rule for module "github.com/gomlx/gopjrt" to local directory "/home/janpf/Projects/gopjrt".
	- Added replace rule for module "github.com/gomlx/gomlx" to local directory "/home/janpf/Projects/gomlx".
In [9]:
import (
    "github.com/gomlx/gomlx/backends"
    fm "github.com/gomlx/gomlx/examples/FlowMatching"
    flowers "github.com/gomlx/gomlx/examples/oxfordflowers102"    
    "github.com/gomlx/gomlx/examples/oxfordflowers102/diffusion"    
    . "github.com/gomlx/gomlx/pkg/core/graph"
    "github.com/gomlx/gomlx/pkg/ml/context"
    "github.com/gomlx/gomlx/pkg/ml/train"
    timage "github.com/gomlx/gomlx/pkg/core/tensors/images"
    "github.com/janpfeifer/gonb/gonbui"
    "github.com/janpfeifer/must"

    _ "github.com/gomlx/gomlx/backends/default"
)

type Dataset = train.Dataset

var (
    backend = backends.MustNew()
    _ *Node = nil

    flagDataDir    = flag.String("data", "~/work/oxfordflowers102", "Directory to cache downloaded and generated dataset files.")
	flagEval       = flag.Bool("eval", true, "Whether to evaluate the model on the validation data in the end.")
	flagVerbosity  = flag.Int("verbosity", 1, "Level of verbosity, the higher the more verbose.")
	flagCheckpoint = flag.String("checkpoint", "", "Directory save and load checkpoints from. If left empty, no checkpoints are created.")

    // settings is bound to a "-set" flag to be used to set context hyperparameters.
    settings = commandline.CreateContextSettingsFlag(fm.CreateDefaultContext(), "set")
)

func NewConfig() (config *diffusion.Config) {
    ctx := fm.CreateDefaultContext()
    paramsSet := must.M1(commandline.ParseContextSettings(ctx, *settings))
    config = diffusion.NewConfig(backend, ctx, *flagDataDir, paramsSet)
    if *flagCheckpoint != "" {
        config.AttachCheckpoint(*flagCheckpoint)
    }
    return
}

%%
cfg := NewConfig()
must.M(flowers.DownloadAndParse(cfg.DataDir))
fmt.Println("Oxford Flowers 102 dataset downloaded:")
fmt.Printf("\t%d images, %d labels, %d examples\n", len(flowers.AllImages), len(flowers.AllLabels), flowers.NumExamples)
Oxford Flowers 102 dataset downloaded:
	8189 images, 8189 labels, 8189 examples

Sample of the Oxford Flowers 102¶

To do that, we create a temporary dataset (with NewDataset) of 256x256-pixel images, and then show a sample of the flowers.

Later we will use a model that uses only 64x64 pixels.

In [10]:
// sampleTable generates and outputs one html table of samples, sampling rows x cols from the images/labels provided.
func sampleTable(title string, ds train.Dataset, rows, cols int) {
    htmlRows := make([]string, 0, rows)
    for row := 0; row < rows; row++ {
        cells := make([]string, 0, cols)
        for col := 0; col < cols; col++ {
            cells = append(cells, sampleOneImage(ds))
        }
        htmlRows = append(htmlRows, fmt.Sprintf("<tr>\n\t<td>%s</td>\n</tr>", strings.Join(cells, "</td>\n\t<td>")))
    }
    htmlTable := fmt.Sprintf("<h4>%s</h4><table>%s</table>\n", title, strings.Join(htmlRows, ""))
    gonbui.DisplayHTML(htmlTable)
}

// sampleOneImage samples one image from the dataset and returns an HTML-rendered image with its label.
func sampleOneImage(ds train.Dataset) string {
    _, inputs, labels := must.M3(ds.Yield())
    imgTensor := inputs[0]
    img := timage.ToImage().Single(imgTensor)
    exampleNum := inputs[1].Value().(int64)
    label := labels[0].Value().(int32)
    labelStr := flowers.Names[label]
    
    imgSrc := must.M1(gonbui.EmbedImageAsPNGSrc(img))
    width, height := imgTensor.Shape().Dimensions[0], imgTensor.Shape().Dimensions[1]
    return fmt.Sprintf(`<figure style="padding:4px;text-align: center;"><img width="%d" height="%d" src="%s"><figcaption style="text-align: center;">Example %d:<br/><span>%s (%d)</span><br/>(%dx%d pixels)</figcaption></figure>`, 
                       width, height, imgSrc, exampleNum, labelStr, label, img.Bounds().Dx(), img.Bounds().Dy())
}

%% --set="image_size=256"
cfg := NewConfig()
must.M(flowers.DownloadAndParse(cfg.DataDir))
ds := flowers.NewDataset(dtypes.U8, cfg.ImageSize)
ds.Shuffle()
sampleTable("Oxford 102 Flowers Sample", ds, 4, 4)

Oxford 102 Flowers Sample

[16 sample images:]

Example 7019: siam tulip (38), 256x256 pixels
Example 6441: love in the mist (32), 256x256 pixels
Example 2510: morning glory (75), 256x256 pixels
Example 574: cyclamen (87), 256x256 pixels
Example 6728: toad lily (78), 256x256 pixels
Example 3373: fritillary (22), 256x256 pixels
Example 2156: thorn apple (74), 256x256 pixels
Example 460: cyclamen (87), 256x256 pixels
Example 6516: corn poppy (25), 256x256 pixels
Example 2669: geranium (57), 256x256 pixels
Example 5452: sunflower (53), 256x256 pixels
Example 1786: hibiscus (82), 256x256 pixels
Example 5887: black-eyed susan (62), 256x256 pixels
Example 819: frangipani (80), 256x256 pixels
Example 4835: hippeastrum (90), 256x256 pixels
Example 1500: poinsettia (43), 256x256 pixels
In [19]:
// Remove cached file to force regeneration.
// !rm -f "${HOME}/work/oxfordflowers102/"*_cached_images_*
import "github.com/gomlx/gomlx/pkg/support/fsutil"

%% --set="image_size=128"
cfg := NewConfig()
trainDS, validationDS := cfg.CreateInMemoryDatasets()

fmt.Println()
fmt.Printf("Total number of examples: #train=%d, #validation=%d\n", trainDS.NumExamples(), validationDS.NumExamples())
fmt.Printf("trainDS (in-memory) using %s of memory.\n", fsutil.ByteCountIEC(trainDS.Memory()))
fmt.Printf("validationDS (in-memory) using %s of memory.\n", fsutil.ByteCountIEC(validationDS.Memory()))

// Output a random sample.
trainDS.Shuffle()
sampleTable("Oxford 102 Flowers Sample -- In-Memory Dataset", trainDS, 1, 4)
Total number of examples: #train=6487, #validation=1702
trainDS (in-memory) using 304.2 MiB of memory.
validationDS (in-memory) using 79.8 MiB of memory.

Oxford 102 Flowers Sample -- In-Memory Dataset

[4 sample images:]

Example 8152: japanese anemone (61), 128x128 pixels
Example 157: passion flower (76), 128x128 pixels
Example 1296: rose (73), 128x128 pixels
Example 6788: fire lily (20), 128x128 pixels

Training¶

Because it can take hours, we recommend training from the command-line.

For this demo we trained two models, one for 64x64 images and another for 128x128, using github.com/gomlx/gomlx/examples/FlowMatching/demo. Here is how we ran them (from the demo directory):

  1. fm_64x64: `go run . --checkpoint=fm_64x64`
  2. fm_128x128: `go run . --checkpoint=fm_128x128 --set='image_size=128;diffusion_channels_list=32,64,96,128,160;diffusion_num_residual_blocks=6;learning_rate=1e-2;diffusion_loss=huber'`

The Model¶

The model uses a U-Net to predict the velocity of the change of distribution from noise to flower, $u(X_t, t)$, where $X_t$ is the image, with an extra conditioning variable: the flower id (one of the 102 available). Other than that, the training is exactly the same as in the first example of this notebook.

The images are first "Gaussian normalized" to a mean of 0 and standard deviation of 1, with statistics computed over the whole dataset. The flowers are generated in this normalized space, and then de-normalized before displaying.
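The normalize/de-normalize pair is a simple affine map and its inverse. A stdlib-only sketch (function names and statistics are hypothetical; the real statistics are computed over the whole dataset by the diffusion package):

```go
package main

import "fmt"

// normalize maps pixel values into the model's working space using
// dataset-wide mean and standard deviation.
func normalize(pixels []float64, mean, stddev float64) []float64 {
	out := make([]float64, len(pixels))
	for i, p := range pixels {
		out[i] = (p - mean) / stddev
	}
	return out
}

// denormalize is the exact inverse, used before displaying generated images.
func denormalize(normed []float64, mean, stddev float64) []float64 {
	out := make([]float64, len(normed))
	for i, v := range normed {
		out[i] = v*stddev + mean
	}
	return out
}

func main() {
	// Hypothetical dataset statistics, for illustration only.
	mean, stddev := 0.45, 0.27
	pixels := []float64{0.0, 0.45, 1.0}
	back := denormalize(normalize(pixels, mean, stddev), mean, stddev)
	fmt.Printf("round-trip of %v: %v\n", pixels, back)
}
```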

Below we train from the notebook for just 100 steps. We use this during development to check that everything is working.

In [11]:
!rm -rf ~/work/oxfordflowers102/fm_base
In [12]:
%% --checkpoint fm_base --set=train_steps=100;plots=false
cfg := NewConfig()
fm.TrainModel(cfg, *flagCheckpoint, *flagEval, *flagVerbosity)
Backend "stablehlo":	stablehlo:cuda - PJRT "cuda" plugin (/home/janpf/.local/lib/gomlx/pjrt/pjrt_c_api_cuda_plugin.so) v0.76 [StableHLO]
	train_steps=100
	plots=false
Starting training stage:
       100% [========================================] (26 steps/s) [step=99] [loss+=0.746] [~loss+=1.21] [~loss=1.21]                  
	[Step 100] median train step: 36501 microseconds

Results on train:
	Mean Loss+Regularization (#loss+): 0.692
	Mean Loss (#loss): 0.692
Results on validation:
	Mean Loss+Regularization (#loss+): 0.696
	Mean Loss (#loss): 0.696

Training Progression¶

By default, the training loop in fm.TrainModel pre-generates a fixed set of random noise at the start of training, and every now and then transforms that noise into sampled images. This way we can see how the generative model is progressing.

The function fm.PlotModelEvolution can display a subset of those sampled images, along with the global_step of the model at the time each image was generated. Set animate=true to display them as an animation (instead of a list).

In [13]:
%% --checkpoint fm128x128
cfg := NewConfig()
fm.PlotModelEvolution(cfg, 8, /*animate=*/ true)

Generated samples in /home/janpf/work/oxfordflowers102/fm128x128:

Sample Flowers¶

This is how we sample flowers from the trained model. fm.DisplayImagesAcrossTime takes the number of images to generate, the number of steps used to generate each image (more is better, but slower), and whether to display some intermediary images while the final image is being generated.

In [14]:
%% --checkpoint fm128x128
cfg := NewConfig()
fm.DisplayImagesAcrossTime(cfg, /*numImages*/ 4, /* numSteps */ 20, /*displayEveryNSteps*/ 10)
DisplayImagesAcrossDiffusionSteps(4 images, 20 steps): noise.shape=(Float32)[4 128 128 3]
	Model #params:	39155748
	 Model memory:	149.4 MiB

0.00% Transformed

[4 generated images]

52.63% Transformed

[4 generated images]

100.00% Transformed

[4 generated images]

Sample Per Flower Type¶

In [15]:
%% --checkpoint=fm128x128
cfg := NewConfig()
for ii := 0; ii < 5; ii++ {
    flowerType := int32(rand.Intn(flowers.NumLabels))
    gonbui.DisplayHTML(fmt.Sprintf("<p>Generated <b>%s</b></p>\n", flowers.Names[flowerType]))
    fm.PlotImagesTensor(fm.GenerateImagesOfFlowerType(cfg, 9, flowerType, 30))
}

Generated barbeton daisy

[9 generated images]

Generated foxglove

[9 generated images]

Generated buttercup

[9 generated images]

Generated sunflower

[9 generated images]

Generated siam tulip

[9 generated images]

With Widgets¶

In [ ]:
%% --checkpoint fm128x128
cfg := NewConfig()

// Create UI with diffusion generated flowers.
divId := dom.CreateTransientDiv()
doneSteps := fm.SliderDiffusionSteps(cfg, "slider_diffusion_steps", 8, 30, divId)
doneFlowers := fm.DropdownFlowerTypes(cfg, "dropdown_flower_types", 8, 20, divId)

// Wait for OK button.
button := widgets.Button("Done").AppendTo(divId).Done()
<-button.Listen().C

// Clean up and persist HTML (so it can be saved).
doneSteps.Trigger()
doneFlowers.Trigger()
dom.Persist(divId)
gonbui.DisplayHTML("Finished.")