The practice of optimization comes down to measurement. Unless an algorithm is doing something totally bogus (such as sorting by randomizing the elements and then checking whether they are sorted), it could well be the most efficient method for that application. Since you brought up profiling, it seems you are already on the right track.
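To make "measure, don't guess" concrete, here is a minimal sketch of a micro-benchmark using Python's standard `timeit` module. The two candidate functions and the workload are made up purely for illustration; the point is the shape of the comparison, not these particular implementations.

```python
import timeit

# Two hypothetical candidates for the same task: summing squares of 0..n-1.
def sum_squares_loop(n):
    total = 0
    for i in range(n):
        total += i * i
    return total

def sum_squares_genexpr(n):
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    for name, fn in [("loop", sum_squares_loop), ("genexpr", sum_squares_genexpr)]:
        # Run each candidate many times so the timing is not dominated by noise.
        elapsed = timeit.timeit(lambda: fn(10_000), number=200)
        print(f"{name}: {elapsed:.4f}s")
```

Before timing, it is worth asserting that both candidates produce the same answer; a fast wrong function is not an optimization.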
Incidentally, a major caveat with asymptotic analysis (e.g. Big O/Theta/Omega notation) is that it is meant for large data sets, like sorting 1,000,000 elements in a list. When dealing with only a few hundred elements and operations on them, the lower-order terms and constant factors that asymptotic analysis discards tend to become relevant. At that level it is better to simply count the costs directly. Of course, it is best to measure, though if there is an easy analytic choice between algorithms, go with the faster one to start.
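A small sketch of that caveat, with made-up data sizes: a linear scan is O(n) and a binary search (via the standard `bisect` module) is O(log n), yet on a tiny sorted list the scan's smaller constant factor can make it competitive or even faster. Which one wins at a given size is exactly what measuring settles.

```python
import bisect
import timeit

def linear_search(items, x):
    # O(n), but very little work per element.
    for i, v in enumerate(items):
        if v == x:
            return i
    return -1

def binary_search(items, x):
    # O(log n) on a sorted list, but more work per step.
    i = bisect.bisect_left(items, x)
    return i if i < len(items) and items[i] == x else -1

if __name__ == "__main__":
    small = list(range(8))  # only a handful of elements
    for name, fn in [("linear", linear_search), ("binary", binary_search)]:
        elapsed = timeit.timeit(lambda: fn(small, 5), number=100_000)
        print(f"{name}: {elapsed:.4f}s")
```

At a few hundred or a few thousand elements the asymptotic ranking reasserts itself, which is why the small-n result should never be extrapolated upward without re-measuring.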