Marty's 2017 Software Diary
1 January 2017
1.1 Tuesday, 3rd
1.2 Thursday, 19th
My two frontiers on the software front, since the last entry:
- my word-counting exercise, mostly complete, but still needing a bit of hand-holding at the last steps.
- I suppose I should include the spreadsheet work as well in the "software work": lots of tuning on the investment tools. I should be describing those efforts in the …
I'm now taking advantage of emacs bookmarks, the way they should be used, I think. E.g. I just planted swdiary here. I guess the trick is bookmarking every time you finish editing a piece. What I used to do was plant the text EDIT_MARK as a comment at the place to return. Now it's simply:
C-x r m { and give it a name the first time in a file }
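For the return trip, the companion bindings (stock Emacs, not my own invention):

    C-x r b { jump to a named bookmark }
    C-x r l { list all bookmarks }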
The idea of the day is to use the OrgMode tag as the organizing principle to index my Notes in my Commonplace Book.
Now to mop up on #1.
1.3 Friday, 27th
The big news today: a yuge breakthru with HTML written with m4 macros. The paper is now here with my New M4 Definitions for HTML.
2 February 2017
2.1 Thursday, 9th qff fapi
This is a big day in function history. I've really cranked up the value of three functions:
2.1.1 qff – Quick Function Fix editor commandline balance vix
I hadn't used qff routinely, if at all, since November. Here's a usage table:
    uses  date        running total
    31    2016-08-03   31
     1    2016-08-25   32
     2    2016-08-27   34
     1    2016-08-28   35
     1    2016-09-04   36
     2    2016-09-15   38
     2    2016-09-18   40
     1    2016-09-19   41
     8    2016-09-25   49
     4    2016-09-29   53
     1    2016-09-30   54
     1    2016-10-02   55
     4    2016-10-27   59
     1    2016-11-03   60
     2    2016-11-18   62
    19    2017-02-09   81
Re-discovering the function has enabled me to strike the necessary balance between working from the command line, and within the friendly confines of emacs.
Reviewing the code, it's pretty bare-bones, using comment and quietly from the common list, and whf, pick, vix and myname from the more esoteric; for example, none of the report_ family. As it turns out, qff relies heavily on vix to do its work. In the call graph, qff uses no functions that vix doesn't:
- comment
- myname
- quietly
- vix
- backup
- backup_one
- backup_here
- timestamp
- epoch
- trace_call
- trace_stderr
- trace_call
- trace_call
- epoch
- timestamp
- ignore
- report_notfile
- report_usage
- comment
- myname
- report_usage
- backup_here
- comment
- foreach
- cmd
- report_notcommand
- ignore
- quietly
- report_usage
- trace_call
- backup_one
- backup_log
- pptlog
- trace_call
- trace_call
- pptlog
- comment
- fix_log
- pptlog
- trace_call
- myname
- report_notargcount
- report_usage
- report_notfile
- trace_call
- vfirstline
- trace_call
- whf
- funclocn
- field
- trace_call
- function_paths
- printfirst
- trace_call
- profile
- shell_onlyfiles
- trace_call
- field
- funclocn
- backup
- whf
This suggests working on a recent idea: publish some callgraphs separately, such as any with depth in the shelflib.
2.1.2 fapi – Function API bigday shdoc sfuse whfn
Fapi has gone from a one-liner to a modest application. It started when I added the stand-alone function fun_book, to "give me the book on this function", which included fapi (the one-liner), shdoc, sfuse, and the recent whfn. I also wrote a stand-alone fun_show to support fun_book.
Now fapi is the former fun_book. It produces four reports:
- the old fapi – its function body and three users who call the function,
- its shdoc – the first ':' lines in the function body,
- the 'sfuse' report – the context of the function call, and
- the 'whfn' report – where the function is found, or housed.
2.1.3 shdoc – SHell DOCumentation
With the promotion of fapi, shdoc takes on added significance. Today's enhancement removed the need for an shd_with, folding the functionality into shdoc itself. The simplest example is on itself:
    $ shdoc
    function shdoc_doc () {
        : date 2017-02-09;
        : this is a shell doclib "shdoc" comment;
        : --------------------------------------;
        : an shdoc comment is the first ":"-origin lines;
        : in the shell function, the rest being the executable.;
        : writes a function {name}_doc for function "name";
        : two things: a. need an _each so awk resets foreach,;
        : and b. "function NAME_doc () {" is a sticky format.;
    }
    $
2.2 Sunday, 19th rsync
Yesterday, while looking to use the OrgMode projects feature, it recommended rsync among two others. I found an implementation at the University of Texas, and wrote Bill Anderson about it. This note details a number of principles I'm incorporating in my shell library practice.
3 March 2017
3.1 Thursday, 30th
Today, I discovered the key to my little language for shdoc. The need is to capture the key ingredients of a function. The first ad-hoc name is date, supported by shd_getdate and shd_latest. What is needed then is a small piece of syntax. Here it is: in the first colon-comment lines of the function, the first post-colon word in any line, when it ends with a colon, bears a property label.
Credit OrgMode thinking for that. So, here's a sample from a current function:
    daily_datewordsstem () {
        : date 2017-03-27;
        : mdorg_words has a LIMITATION, "DATE" must be the LAST field
        : input: stemPsufOsufs.rdb needs only STEM, PSUF, and OSUFS fields
        : output: DateWordsStem.now
        : functions: column addcol mdorg_words onlyChanged
        : OSUFS may be retired so there is only a PSUF field, OR,
        : kept around as an indication a single OSUF may be a useful copy
        column stem psuf < stemPsufOsufs.rdb | addcol words date | mdorg_words |
            column stem psuf words date | column date words stem |
            onlyChanged DateWordsStem.now
    }
In this example, then, the properties are input, output, and functions, along with date as it is currently recognized. So, a first step is to update the shd_ functions which recognize the date and allow a trailing colon (:). Then a function language may be developed to deal with the shdoc comment block. See the update on May 15.
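Meanwhile, a minimal sketch of the rule; shd_props is a name invented here, and it leans only on declare and awk:

    shd_props () {
        : sketch: list the property labels in a function, per the rule above;
        : a label is the first post-colon word, itself ending in a colon;
        declare -f ${1:?function name} | awk '
            $1 == ":" && $2 ~ /:$/ { sub(/:$/, "", $2); print $2 }
        '
    }

Run against daily_datewordsstem above, it would report input, output, and functions; date joins the list once the trailing colon is adopted.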
4 April 2017
4.1 Saturday, 1st
Early this a.m., I'd say about 5 am, the light went on: why dump purely local functions in the big bucket? As a matter of fact, I'll modify libsave to prefer an existing local mklib. So, today's starter function, mk:
    mk () {
        : date: 2017-04-01;
        for lib in ./mklib $(which mklib); do
            [[ -f $lib ]] && { . $lib; return; }
        done
    }
An existing local mklib is the preferred thing to make. So, here's the latest libsave:
    libsave () {
        : ~ [file, cmdlib] function ...;
        : date: 2016-09-30;
        : date: 2016-10-01;
        : date: 2016-10-19;
        : date: 2016-10-27;
        : date: 2016-11-03;
        : date: 2016-11-25;
        : args: library function ... default: ./mklib ...;
        : date: 2017-04-01;
        report_notargcount 1 $# [file] function ... && return 1;
        local file;
        case $# in
            1) [[ -f ./mklib ]] && file=./mklib || file=$(which cmdlib) ;;
            *) local file=$1; shift ;;
        esac;
        [[ -f $file ]] || { file=$(which $file); };
        report_notfile $file && return 2;
        shd_setdate $* | tee -a $file
    }
As is, with way too many dates.
Now, to remove the archaic $(shift N; echo $*), made obsolete by the simpler:

    echo ${*:N+1}    # e.g. in lieu of $(shift 2; echo $*)
    echo ${*:3}      # the cleaner approach
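A quick check at the prompt (demo is a throwaway name):

    $ demo () { echo ${*:3}; }
    $ demo a b c d e
    c d e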
Revisiting daily_datewordsstem from March 30, a function language may then deal with the shdoc comment block. Here's a start:
    $ shd_trim daily_datewordsstem
Produces ..
4.2 Thursday, 6th
Last night I gave my paper diary 600 words on the motivation to move some of my OrgMode to Markdown files, and how I would do it to keep track of work started in OrgMode, then picked up in Markdown, without pretending to have written whole paragraphs afresh.
Here is my continuing work in Pandoc on the conversion process.
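For reference, the core of the conversion is a pandoc one-liner; the file names here are illustrative:

    pandoc -f org -t markdown notes.org -o notes.md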
4.3 Saturday, 8th
OK, after a gut-check, I dove into mdorg_words and fixed the question of the month.
I'm patting myself on the back for this one, since it involves a few things working together. First, my home-grown RDB, which I've been carrying around for 30+ years. It uses awk, so the function here is a reasonable compute script in RDB. Here's a copy of the fapi report:
    # ---- _fapi mdorg_words --
    lit_collect_todaysdata () {
        : date: 2017-03-27;
        : args: file with at least STEM, and PSUF fields;
        : uses: column mdorg_words rd;
        : cloned: daily_datewordsstem;
        report_notfile $1 && return 1;
        column stem psuf < $1 | mdorg_words | column date words stem
    }
    # ---- shdoc mdorg_words --
    function mdorg_words_doc () {
        : date: 2017-03-22;
        : computes WORDS and DATE rdb fields;
        : from STEM and PSUF input fields with awk system;
        : using CAT, WC, and STAT;
        : todo: test the split with a " " separator, the default is TAB!?;
        : date: 2017-04-08;
    }
    # ---- whfn mdorg_words --
    # mdorg_words 415 ./mklib
    # ---- fuse mdorg_words --
    # lit_collect_todaysdata    column stem psuf < $1 | mdorg_words | column date words stem
And the code:
    mdorg_words () {
        : date: 2017-03-22;
        : computes WORDS and DATE rdb fields;
        : from STEM and PSUF input fields with awk system;
        : ref: http://stackoverflow.com/questions/1960895/assigning-system-commands-output-to-variable;
        : uses: column compute row;
        : date: 2017-04-08;
        report_notpipe && return 1;
        column date stem psuf words | compute '
            file = stem "." psuf
            wcwd = "cat " file " | wc -w";
            yymd = "stat -t %y%m%d " file
            system ( wcwd | getline words)
            system ( yymd | getline date )
            # we dont need the ""s in the stat return
            gsub(/"/,"", date)
            split(date,pieces," ");
            date = pieces[10]
        ' 2> .mdwords.err | row 'words !~ /^$/'
    }
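For the record, the awk idiom behind it, from the stackoverflow reference, reduced to a minimum:

    awk 'BEGIN {
        cmd = "date +%F"       # any shell command
        cmd | getline today    # the first line of its output lands in the variable
        close(cmd)             # lets the command be run again later
        print today
    }'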
4.4 Monday, 24th
My current project, now that taxes are paid, is the annual portfolio reevaluation. Here is the work: User Instruction
What have I learned? The big news is a work flow for maintaining shell libraries; here's the latest section: updating mklib, other libraries.
Some of the highlights:
- a local library is conveniently named mklib
- any fixes for this library and its supplier functions are in a local fixlib
- a development cycle ends when functions in fixlib are folded back into mklib or any other supplier libraries
- generic functions, or those with wider use, are promoted from their mklib to a library on the user's PATH
- tools and procedures are advanced to facilitate this practice
- one fallout is the first tool to collect functions referenced in the paper.
- and a debugging technique: read_tty
What remains for the penultimate step is to capture the referenced functions in a local and searchable library. Time for a design document. First the function to collect references.
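As a first sketch (fun_refs is a name invented here; the idea resurfaces in June as fun_mentions), compare the functions defined in this shell against the words of a file:

    fun_refs () {
        : sketch: functions defined in this shell which the file mentions;
        report_notfile ${1:-"no file argument"} && return 1;
        comm -12 <(declare -F | awk '{ print $3 }' | sort) \
                 <(tr -cs 'A-Za-z0-9_' '\n' < $1 | sort -u)
    }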
5 May 2017
5.1 Sunday, 14th
Happy Mother's Day.
The last two weeks have distracted me further. Book Club, Men's Club, excuses, excuses, …
Big problems are being solved. In the last two days, functions fun_move and fuse_lib have solved higher level management problems. Here's the fapi for the former:
    $ fapi fun_move
    # ---- _fapi fun_move --
    fixesto () {
        set ${1:-mklib};
        report_notfile ${1:-"No Mklib"} && return 1;
        fun_move fixlib $1 $(functions fixlib)
    }
    # ---- shdoc fun_move --
    function fun_move_doc {
        : date: 2017-05-12;
        : date: 2017-05-14;
    }
    # ---- whfn fun_move --
    # fun_move 2814 ./bin/cmdlib
    # ---- fuse fun_move --
    # fixesto    fun_move fixlib $1 $(functions fixlib)
    $
Fun_move, note its calling sequence, moves functions (the third and following arguments) from the first function library to the second (the first two arguments). When I find duplicated functions, the second library is the bitbucket. The story is here in the notes on Library Development.
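A usage sketch, with illustrative names:

    $ fun_move mklib cmdlib safe_name save_name
    # safe_name and save_name leave ./mklib for cmdlib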
And to clean up an individual library, the latter, fuse_lib, is great at running down the uncalled functions. What's an uncalled function? Either a function used only from the command line, or one that's lost its place in the development cycle. Here's its API:
    $ fapi fuse_lib
    # ---- _fapi fuse_lib --
    COMMAND LINE Function
    # ---- shdoc fuse_lib --
    function fuse_lib_doc {
        : stdout: list of functions with NO function users.;
        : n.b. these are used from the command line, or not at all.;
        : writes: fuse.out -- the fuse report.;
        : functions having no users have one entry.;
        : use this to trim the un-used functions;
        : date: 2017-05-14;
    }
    # ---- whfn fuse_lib --
    # fuse_lib 2874 ./bin/cmdlib
    # ---- fuse fuse_lib --
    $
Note, it's identified as a COMMAND LINE function, and its shdoc is sufficiently descriptive. The whfn says it lives on line 2874 of the cmdlib.
This now raises the possibility of moving only command-line functions to the cmdlib, while other utility functions make their home in programlib.
5.2 Monday, 15th
In which I update a note on March 30, using shd_trim to collect function tags:
    $ shd_trim mdorg_words
    : date: 2017-03-22;
    : computes WORDS and DATE rdb fields;
    : from STEM and PSUF input fields with awk system;
    : ref: http://stackoverflow.com/questions/1960895/assigning-system-commands-output-to-variable;
    : uses: column compute row;
    : date: 2017-04-08;
    : date: 2017-05-14;
    $
Note the multiple date tags. shd_setdate supplies today's date as the last line in the shdoc comment block:
    $ shd_setdate
    shd_setdate () {
        : appends date tag to function, avoiding redundancy;
        : as last line among leading shdoc comments;
        : -----;
        : this uses the local function trick. trailing UNSET;
        : date: 2016-09-30;
        : date: 2016-10-01;
        : update: change date from comment to shd_ tag;
        : uses: awk declare foreach fun_allnames uniq;
        : args: .. function .. library ..;
        : stdout: function_body ...;
        : date: 2016-03-30;
        : date: 2017-05-15;
        function _dffx () {
            declare -f $1 | awk -v date=$(date +%F) '
                BEGIN { code = 0 }
                NR < 3 || ( \
                    NR > 2 && !code && $1 ~ /^:$/ \
                ) { print; next }
                !code {
                    printf " : date: %s;\n", date
                    code = 1
                }
                { print }
            ' | uniq
        };
        foreach _dffx $(fun_allnames ${*:-shd_setdate});
        unset _dffx
    }
    $
And note how it uses its own name as the convenient default. Since this only produces the date tag on the standard output, it's up to the libsave function to supply the tag when restoring the function to a library.
6 June 2017
6.1 Sunday, 4th
Where've I been? Overlooking a great breakthrough: runlib.
It's mentioned in the Only Sh-bang you'll ever need. And here's the first step to indexing the functions mentioned in a file: fun_mentions:
    $ declare -f fun_mentions
    fun_mentions () {
        report_notfile ${1:-"no file argument"} && return 1;
        comm -12 <(sfg .) <(tpl $1 | sed '
            s/[*~]//g;
            s/[),:;.}"][),:;.}"]*$//
        ' | lc | sort -u)
    }
    $
Also coming in for mention: callStack, trace_missing and find_olddef. When you have a function that is somehow missing from your current environment, and you know you've saved it before, but your housecleaning has inadvertently dispatched it to the bitbucket, use these functions to retrieve it.
    $ find_olddef {function}        # in this case ymd_hms
    ...                             # shows backup files where the function was defined.
    $ f2file { one of the files }   # puts functions in .*lib/fun...
    $ trace_missing {function}      # when run, exposes the call stack.
    $ run {command which said it was missing, and}
    ... examine the callStack report.
    callStack () {
        : date: 2017-06-04;
        { isNotNull "$1" && isHelp "$1"; } && {
            helpShow 'callStack  Diagnostics regarding where the call to this function came from';
            return
        };
        local T="${T}   ";
        local STACK=;
        i=${#FUNCNAME[@]};
        ((--i));
        printf "${T}Function call stack ( command.function() ) ...\n" 1>&2;
        T="${T}  ";
        while (( $i >= 0 )); do
            STACK+="${T}${BASH_SOURCE[$i]}.${FUNCNAME[$i]}()\n";
            T="${T}  ";
            ((--i));
        done;
        printf "$STACK" 1>&2
    }
    trace_missing () {
        : date: 2017-06-04;
        report_notargcount 1 $# "function name to trace where called" && return 1;
        eval "$1 () { callStack \$(myname); }"
    }
    find_olddef () {
        : date: 2017-06-04;
        report_notargcount 1 $# "looking for old function definition" && return 1;
        find .bak -type f | xargs grep -il "$1 ()"
    }
6.2 Tuesday, 6th
6.2.1 The end of underscore, and other noisy characters in filenames
Today I solved the noisy-character file name problem, to a large extent. I'm not sure where the residual problems may come from, but I now have the means to replace file names with what I call "link-suitable" file names.
The problem recently arose in my job as Secretary for the Men's Club. I'm all the time receiving files to post on our site, and creating links in our email messages. Working in a traditional Unix (R) environment, just occasionally the command line runs afoul of spaces, apostrophes, etc., in file names. A prior occasion occurred in my last job before retirement: a Unix file name may include colons, whereas a Windows file name may not. The tool here solves that problem as well.
The residual problem, then, is the URI, e.g. http:// …, and its colon.
The heart of the solution is the tr command, whose -cs flags complement the characters in a file name (those not alpha-numeric, nor the period, the slash, or the underscore) and squeeze each run of them into a single dash. Step one is translating the original name into a safe_name. Step two is linking the existing file to its alternate name in save_name. Showing the small collection from list_name should suffice:
    $ ff list_name safe_name save_name
    list_name () {
        sfg _name$
    }
    safe_name () {
        : todo: figure out where the trailing - comes from?;
        echo $1 | tr ' ' _ | tr -cs 'A-Za-z0-9/._' - | sed 's/-$//'
    }
    save_name () {
        report_notfile "$1" && return 1;
        ${2:-ln -f} "$1" $( safe_name "$1" )
    }
    $
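A worked example (the file name is made up), and a guess at the todo: the trailing dash comes from echo's newline, which lies outside the -cs set and so becomes a dash before the sed trims it.

    $ safe_name "Marty's File: draft (2).docx"
    Marty-s_File-_draft_-2-.docx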
This does 90% of the job. The remaining task, easily handled in emacs, is moving the spacey names to a sub-directory. In a few cases, I've used the directory name spacyname, e.g.
find . -type f | grep ' ' | grep -v /spacyname
yields the list which might require handling.
One last thought: I didn't wrestle long with the concept, but … let's call these requirements:
- preserve the length of the filename.
- alpha-numerics appear at identical character position in both names,
- directories are preserved, e.g. slashes,
- file suffix is preserved, e.g. ".",
- blanks are replaced with underscore
- alpha-numerics are preserved, and
- two other non-alpha-numerics are preserved: underscore and dash,
- non-preserved characters are replaced with a dash.
I had thought of collapsing repeated non-preserved characters, but decided it wasn't worth the work.
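Worth noting: the tr -cs above does squeeze runs, so safe_name doesn't quite preserve length. A sketch honoring the letter of these requirements, under an invented name:

    safe_name_strict () {
        : sketch: one-for-one replacement, preserving name length and character positions;
        : newline stays in the set, so no trailing dash to trim;
        echo "$1" | tr ' ' _ | tr -c 'A-Za-z0-9/._\n-' -
    }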
6.3 Saturday, 10th
There's more to say about name-saving, but here's a higher-level problem in need of work: in any function library, starting from a mklib, some functions will have public scope, to be used anywhere, while others should remain local, to do the book-keeping on some data.
What's needed then is a means of posting some functions to a public location, on the user's PATH, while other functions in the same library remain local. The first idea coming to mind is a function to list the public functions. Since the library has all the functions, to extract the local, just compare the list of the library with the list to be made public.
The place to start working is a function to maintain a list of, in this case, public functions. It seems if we are going to the trouble of creating a means of maintaining a list of functions, we don't need to bind it to how the list is used, in this case, naming the public functions.
A function family to maintain the list of functions should:
- return the current list,
- add members to the list,
- delete members from the list,
- use the smart function property
- be conventionally associated with a family of functions,
- and identify its usage
Let's imagine how these might work. The function name alone should return its members. When the function is called with arguments, the names are entered if not already members, and deleted if they are.
I think the naming convention should follow this format:
    {family}_{usage}_list
So, for example, a function might be named rdbcmd_public_list, for the functions in the rdbcmdlib intended for public use.
The other idea in the back of my head: this mechanism suggests, in the case of the public lists, that the functions thus identified all be collected in a single public library. And it opens the possibility that some lists are public and some are not.
So, first let's get to the list maintenance behavior.
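To make the behavior concrete before any design document, a minimal sketch; the family name and the variable holding the members are both stand-ins:

    rdbcmd_public_list () {
        : sketch: no arguments lists the members;
        : with arguments, toggles membership: absent names added, present names deleted;
        local name list=" ${RDBCMD_PUBLIC:-} ";
        [[ $# -eq 0 ]] && { echo $list; return; };
        for name in $*; do
            case "$list" in
                *" $name "*) list=${list/ $name / } ;;  # a member: delete
                *)           list="$list$name "     ;;  # not a member: add
            esac;
        done;
        RDBCMD_PUBLIC=$(echo $list);
        echo $RDBCMD_PUBLIC
    }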
6.4 Saturday, 24th
Here's an exercise from Info, Elisp Introduction, Practicing Evaluation, section 2.5:
Find a file with which you are working and move towards its middle. Find its buffer name, file name, length, and your position in the file.
    (buffer-name)        "swdiary-2017.org"
    (buffer-file-name)   "/Users/applemcg/Dropbox/commonplace/software/swdiary-2017.org"
    (point)              24482
    (buffer-size)        24599
7 July 2017
7.1 Saturday, 15th
For lack of a better location, it's time to open a discussion on a regular set of family subfunctions. These are names which, when used in a family, have a consistent use. This follows the idea I opened on Saturday, 10th. First, here are a few functions to locate any function definition:
    $ ff whf funclocn locallib
    whf () {
        : date 2016-11-11;
        : date 2016-11-13;
        : date: 2017-05-29;
        : date: 2017-07-15;
        awk -v patn="^${1}$" '
            $1 ~ patn && $2 ~ /^[(][)]$/ && !p[FILENAME]++ { print FILENAME }
        ' $(funclocn 2>/dev/null)
    }
    funclocn () {
        : date 2016-11-13;
        : date: 2017-05-12;
        : date: 2017-07-15;
        ( ls $HOME/.*profile* 2> /dev/null;
          locallib;
          funclibs ) | xargs ls -i | printfirst | field 2
    }
    locallib () {
        : date: 2017-07-15;
        find $HOME/{doc,src,Dropbox,git,stonebridge} -name locallib | ndot3
    }
    $
The new one, locallib, finds locallibs in my favorite places. For the moment, I'm resisting naming a function.
8 August 2017
8.1 Sunday, 6th
Over this weekend, I've embarked on a blog library. I'll not put it in the paper copy here, either as part of the diary, or in the technology (red) sections, nor the writing (blue) sections of my Commonplace Book. Rather it's here at my M C Wagon nom de plume.
A secret for these pages: M. C. Wagon is, of course, a passable anagram for McGowan. A little search tells me "Wagon" is an unlikely surname in North America, with a plurality of its bearers being Native American. All the better. Its European roots are Dutch, from the 16th century when they dominated trade.
The task before me is to develop a "tag" feature, where any page may have an arbitrary (zero included) number of tags. Key to this feature is the ability to re-direct a page rendered at one address to its fundamental page. It may be stressing the point, but if one could guarantee the property of the Unix (R) link between files, you wouldn't need a URL redirection. These functions from the blog family show the redirection implementation and context.
    MC_Wagon.$ ff blog_{home,r_url,template_redirect} bt_thruhead
    blog_home () {
        : mfg: blog_init;
        : date: 2017-08-06;
        ${@:-echo} /Users/applemcg/Dropbox/marty3/MC_Wagon
    }
    blog_r_url () {
        trace_call $*;
        printf "<meta http-equiv=\"refresh\" content=\"0; url=%s\">\n" $1
    }
    blog_template_redirect () {
        bt_thruhead blog_r_url $1
    }
    bt_thruhead () {
        trace_call $*;
        tag_pair HTML $( tag_pair HEAD $( tag_pair TITLE M_C_Wagons Blog; $* ) )
    }
    $ blog_template_redirect ./2017/08/05/someblog.html > ./src/someblog.html
The latter command serves to re-direct the browser on visiting ./src/someblog.html to its content at ./2017/08/05/someblog.html, where on a real file system, this command achieves the result:

    ln -f ./2017/08/05/someblog.html ./src/someblog.html

where the latter file is arranged in the file system to point to the identical data set, the i-node in Unix-speak.
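The identical-inode claim is easy to verify at the prompt:

    $ ls -i ./2017/08/05/someblog.html ./src/someblog.html
    # both names report the same i-node number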
To me, it's bowing to accept the lowest common denominator, not the best we have to offer. If FTP were "link-preserving" and no file-system could be admitted as a data server without the property, it would be so much better. – rant over.
9 September 2017
9.1 Wednesday, 6th clf fuse
A couple of quick functions today. isclf returns true for a function not used by any other function; the implication being it's a Command Line Function. Either that, or one which should be retired.
    clfnames () {
        function _clfnames () {
            isclf $1 && echo $1
        };
        foreach _clfnames ${*:-$(myname)};
        unset _clfnames
    }
    isclf () {
        : isclf, Command Line Function, returns TRUE if so;
        set -- $(args_uniq $(fuse ${1:-$(myname)} | field 1));
        return $#
    }
Notice isclf uses a bit of a trick. If there are no instances of the function being found (via fuse), it returns 0, which is shell for true.
clfnames accepts any number of function names, and lists those on the stdout which are alleged command line functions.
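For instance, checking the May find (assuming fuse_lib still has no function callers):

    $ isclf fuse_lib && echo "command line function"
    command line function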
9.2 Thursday, 7th backup version
My current backup system, being local, has a problem with a project developed in parallel directories. At the moment, the blogging project has backup files in four or five directories, including the root, and sub-directories src, data, and lib. The backup tool pushes backups down a .bak/ stack, the newest on top; the oldest will be some levels down: .bak/.bak/.bak/ ...
Until today, each subdirectory had its own .ver/ directory, with its own date stamps. So, this leaves two problems: you can't guarantee a similar date stamp, and more importantly, the version date stamp needs to apply to the tree from the top, not the bottom.
These additions fix both problems. Here's an example of the way the versioned files now appear:
    .ver/201709_Sep/07_Thu/214556/src/eclipse_Expedition_note02.md
    ...
    .ver/201709_Sep/07_Thu/214556/bin/locallib
And the way they used to appear:
    src/.ver/201709_Sep/07_Thu/085811/eclipse_Expedition_note02.md
    ...
    bin/.ver/201709_Sep/07_Thu/085811/locallib
The point being that you can now find consistent versions of all files from a common point in the tree. In this latter case, the two time-stamps, to the minute, are coincidentally the same. The first set ... date in the function guarantees all time-stamps are the same.
With a little testing and patience, it's possible the copy of allfiles_version can replace the current backup_version, which produced the latter versioned file.
allfiles_backup finds all backup files, and eliminates all but the latest backup. To avoid un-backed-up files, backup_sync is called from backup_alldirs, the first step at the top of the for loop. allfiles_backup is called again from allfiles_version to supply the list of files to collect in the new version. The function produces a list of files in this format:
    ./src/.bak/.bak/.bak/eclipse_Expedition_note02.md
    ./src/.bak/.bak/eclipse_Expedition_note02.md
    ./src/.bak/eclipse_Expedition_note02.md
    ...
    ./bin/.bak/locallib
Note the first file has three backup files, the first being the oldest, the single .bak/ being the searched-for and newest. The awk script plucks out the .bak/ after the version is inserted at the front of the name, producing the first result above.
backup_alldirs is used to guarantee the versioned directories are created before the backup files are linked (ln -f) to the versioned files. The ! printed[$1]++ is an awk idiom I frequently use to do what it says: if not printed (I'm counting), then print it.
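The idiom in isolation:

    $ printf "src\nbin\nsrc\n" | awk '! printed[$1]++'
    src
    bin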
This approach demonstrates the vastly underappreciated value of the UNIX (R) link command. The backup files may eventually be pushed down the backup stack. One presumes the versioned files are never removed. The link is preserved to the backup, so long as the backup file is moved (mv) rather than copied (cp); moving preserves modification time, and the all-important inode of the file, its "disk" address.
    allfiles_version () {
        : backup to DATE stamp, or first argument;
        : date 2017-03-05;
        set ${*:-$(date +%Y%m_%b/%d_%a/%H%M%S)};
        set $(needir .ver/$1)
        trace_call $*;
        for d in $(quietly backup_alldirs)
        do
            mkdir -p $1/$d
        done
        allfiles_backup | awk -v ver=$1/ '
            {
                dest=$1
                sub(/^/,ver,dest);      # prepend the version
                sub(/\.bak\//,"",dest)  # remove the .bak/
                printf "ln -f %s\t%s\n", $1, dest
            }
        ' | sh -x
        find $1 -type f
    }
    allfiles_backup () {
        find . -type f | grep '\.bak/' | grep -v '\.bak/\.bak/'
    }
    backup_alldirs () {
        for d in $( backup_allfiles | awk -F/ '
            $1 !~ /\.bak/ && ! printed[$1]++ { print $1 }
        ');
        do
            indir $d backup_sync
            echo $d
        done
    }
    fix_init () {
        echo allfiles_version
    }
    fix_init 1>&2
9.3 Saturday, 16th awk motleyfool
Taking a break from the routine, here's an example of how I'm handling non-trivial awk scripts: the awk_file and aff functions, the foolook application, and the associated awk script. The leading comment in foolook describes how I collect the information. The resulting CSV is then imported to a page in a spreadsheet, where the Motley Fool's risk and status may be used by vlookups on other pages to aggregate other data sources for a symbol.
    awk_file () {
        : use NEWEST file for argument, regardless of SUFFIX;
        : date: 2017-05-11;
        : date: 2017-07-18;
        trace_call $*;
        set -- $(awk_files | grep ${1:-$(myname 2)});
        case $# in
            0) return ;;
            *) [[ -f $1 ]] && { set $(ls -t $*); echo $1; } ;;
        esac
    }
    aff () {
        : date: 2017-06-14;
        : date: 2017-07-18;
        set ${1:-aff};
        declare -f $1;
        set -- $(quietly awk_files | grep $1);
        [[ $# -gt 0 ]] || return;
        cat $1;
        echo "# $* "
    }
    foolook () {
        : ----------------------------------------------------------------- --
        : from the Motley Fools recommendation page to a CSV                 --
        : with date, symbol, risk, status, and name                          --
        : suitable for spreadsheet import                                    --
        : ----------------------------------------------------------------- --
        :
        set ${1:-foolrecommendation.txt}
        cat $1 | awk -F'\t' -f $(awk_file) | tee ${1%.txt}.csv
    }
    BEGIN {
        OFS = ","
        lp = 2;      # trailing space, a constant
        fields = 12  # standard record has
    }
    {
        status = ".."
        name = $2
        symbol = $3
        risk = $6
        lsym = length(name)
        lr = 0
    }
    $2 ~ /Hold *$/    { status = "hold";    lr = 4+lp }
    $2 ~ /New! *$/    { status = "new";     lr = 3+lp }
    $2 ~ /Now *$/     { status = "now";     lr = 3+lp }
    $2 ~ /Starter *$/ { status = "starter"; lr = 7+lp }
    NF == fields && !printed[symbol]++ && $1 !~ /^$/ {
        # trim any status from the name
        name = substr( name, 1, lsym-lr)
        # split( $1, dmy, "/")
        # printf "%s%s%s\t%s\t%s\t%s\t%s\n", substr(dmy[3],1,4), \
        #        dmy[1], dmy[2], symbol, risk, status, name
        # printf "%s\t%s\t%s\t%s\t%s\n", $1, symbol, risk, status, name
        printf "%s,%s,%s,%s,%s\n", $1, symbol, risk, status, name
    }
    # ./lib/foolook.awk ./lib/foolook.awk
The aff function displays the function, followed by an associated awk script.
I split the non-trivial awk script out for at least two reasons: it's easier to maintain as a separate script, and the other side of the same coin, I don't have to reload a function library at each edit.
So, the foolook function becomes the app-lication. It arranges the input and output of the awk script.
10 October 2017
10.1 Thursday, 12th
Today, a quick little function to deliver the S&P 500.
It seems the freely available market data providers, Yahoo and Google, no longer provide the market indices through their quoting APIs. Somewhere in 2011, '12, they and Dow Jones decided to pull those pseudo-symbols. They are still available from googlefinance, as .DJI, .INX, and .IXIC for the Dow-Jones, S&P 500, and the Nasdaq, respectively.
So, to use the suggestion in the on-line references, "use an ETF" as a surrogate, I've these two functions which do the job:
    quot_stok () {
        trace_call $*;
        curl -s "http://download.finance.yahoo.com/d/quotes.csv?s=$1&f=l1" | fmdos
    }
    quot_sandp () {
        awk "BEGIN { print ${SANDP_FACTOR:-13.24} * sqrt( $(quot_stok IVW) * $(quot_stok SPY)) }"
    }
I've had quot_stok lying around for quite some time. Today's addition takes the ETF suggestion to heart: multiply the geometric mean of the two popular S&P ETFs by a factor to approximate the S&P. I can update the SANDP_FACTOR from a calculation on the spreadsheet where I track this all. At the moment, it's 13.2391, which is a 0.007% difference from the default.
Why bother at all? I'm dropping the large portion of my daily S&P tracking from google docs, and moving to a desktop application, which I'll report on when it's set. Basically, I'll convert some constants and the model in a spreadsheet to an awk program. Yesterday, I discovered I only need to save five moving averages, the two- thru 162-day averages in factors of 3, the score, and a w, a weighting factor for consecutive days of score saturation. To be precise, there are 9 model constants, which should also be saved.
My direction here is to add a model anticipating a market downturn, and reduce my exposure.
In terms of workload, rather than do daily bookwork, I'd like to cut back to weekly, while collecting these S&P-based calculations to a daily cron job.
I may look in on the investments daily, and maybe make a trade on any given day; my preference would be to make trade decisions on a weekly basis.
11 November 2017
12 December 2017
12.1 Sunday, 24th
It's been a quiet few months in Lake Wobegon. But that's another story. The news today is the growth of rdlib, and for the need here, the function libAsTxt, which probably goes in my once-a-day startup:
    libAsTxt () {
        : date: 2017-12-24;
        for f in *lib; do
            ln -f $f ../lib/$f.txt;
        done
    }
The need of the moment is a local_init function, to avoid the collision which occurs on the function name in all the various places.
    local_init () {
        : date: 2017-12-24;
        set $(lc $(basename $PWD))_init;
        isfunction $1 || {
            comment NO Function: $1. needed here;
            return 1
        };
        $1
    }
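Usage, assuming lc lower-cases its argument, from any project directory:

    $ cd ~/Dropbox/marty3/MC_Wagon
    $ local_init    # runs mc_wagon_init if defined; otherwise complains and returns 1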