Marty's 2016 Software Diary
Table of Contents
1 2016
1.1 January 2016
1.1.1 Sunday, 24th
1.1.2 DONE link to diary for this date
Whew.
The tussle is now taking place on my paper and online diaries. To the neglect of this note. But, have no fear, a lot has been happening. Consult the online entry for this date.
pandoc is the great find of the last two months. My first application is harvesting sister Meg's great collection of Dad's work on Ancestry, and reaping the results here.
1.2 February 2016
1.2.1 Wednesday, 17th
We now have a simple *SH*ell function *DOC*umentation aid: shdoc, now part of the Rationale for the Shell Library-Function Standard. I wrote it as part of my work to collect Meg's Ancestry entries, which I've parked in dropbox history/dev. I've factored it into its own shell function library. It should be moved into a general function management library.
1.3 March 2016
1.4 April 2016
1.4.1 Friday, 22nd
I've got a chasing-my-tail problem. When it's time to fix a function, I find a lower level that needs an update, and so forth. Where does it end?
I've added a few functions to track down where a function might be defined in a tangled Org file. So far so good:
- whf identifies where a function is defined in any funclocn
- ihf selects an abbreviated list, given redundant linked (ln) files
- ohf uses ihf to identify those written by tangling an Org file; there should be only one, returning nothing for those functions not defined in an Org file.
- fohf looks thru the org-generated file for the particular Org file which defined the function
- qff opens the defining source library for editing at the function definition, backs up the library and sources the library, making the new definition immediately available.
Consequently
- ihf should be upward-compatible with whf, i.e. a replacement,
- ohf is a suitable test for the org-defined function
- A super-function is possible to accommodate these two modes of function definition, except the updated Org file needs tangling to make the source capable of being read into the current shell.
- therefore, for the time being, I'm editing a copy of the function in a local fixlib. So, when things settle down – no more tail-chasing – it is then time to run thru the list of functions accumulated in a batch, returning some to their Org file, and the remainder to their non-Org library. (which is what whf is for).
Now, fixtohome is a useful function to help clean up any mess:
fixtohome {some}lib
restores a number of common functions repaired in fixlib to a local somelib.
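I haven't copied fixtohome itself here. As a rough sketch of the idea only, assuming the repaired fixlib has already been sourced and leaning on my functions helper to list a library's names, it might look something like this; the real one surely differs:

fixtohome () { 
    : SKETCH ONLY, not the library version;
    local target=$1;
    report_notfile $target && return 1;
    local f;
    for f in $(functions fixlib); do
        grep -q "^$f ()" $target || continue;     # only functions the target already owns
        declare -f $f >> $target;                 # append the repaired definition
    done
}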
The current challenge when generating an app from library functions is to replace trace_{stderr,call} with a non-logging version.
trace_simple # produces a trace_call () which calls trace_just
Here are the functions:
trace_call () { 
    trace_stderr "$@"
}
trace_simple () { 
    eval "trace_call () { trace_just \"\$@\" 1>&2; }"
}
trace_just () { 
    pa=${FUNCNAME[2]:-COMMANDLINE};
    gr=${FUNCNAME[3]:-COMMANDLINE};
    printf "TRACE %s\t@ %s %d ( %s )\n" "$pa" "$gr" $# "$*"
}
trace_stderr () { 
    pa=${FUNCNAME[2]:-COMMANDLINE};
    gr=${FUNCNAME[3]:-COMMANDLINE};
    printf "TRACE %s\t@ %s %d ( %s )\n" "$pa" "$gr" $# "$*" | logDateOnceHourly ~/lib/trace.log 1>&2
}
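A minimal check of the swap, assuming the four functions above are sourced; foo here is just a hypothetical traced function:

foo () { trace_call "$@"; }     # a stand-in for any function that traces its calls
foo one two                     # traces via trace_stderr, logging to ~/lib/trace.log
trace_simple                    # redefine trace_call as the non-logging version
foo one two                     # now the trace goes to stderr only; no trace.log entry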
1.4.2 Saturday, 23rd
The solution to yesterday's problem is to keep a STACK of Frontier projects.
1.4.3 Monday, 25th
It's time to highlight the OM family of functions:
om_code om_frequent om_generic om_help om_iam om_init om_list om_m om_more om_o om_rdb om_tabs om_tally
The user's entry point is om_iam, which insists on being called from a user's {name}_init function.
The user's documentation is in om, object and methods, in the auxlib.
om_iam creates two functions, name and name_help, the latter if it
doesn't exist at the initial call. The default behavior of name calls
name_help. Or, functions created in this manner may be called with a
sub-function as the first argument. i.e. om_list
may be called:
om list
I find it easier to type a space than the underscore. When coding, it's probably a good idea to use the underscore, since many function maintenance tools will not treat the special call as a function name. There might be a reason to keep some code obscured from the function-discovery process, but I can't think of it now.
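To make the space-versus-underscore point concrete, here is a hedged sketch of the kind of dispatcher om_iam might generate for the om family; the generated code surely differs in detail:

om () { 
    case $# in
        0)  om_help ;;                          # the default behavior: show the help
        *)  local sub=$1; shift;
            if declare -f om_$sub > /dev/null; then
                om_$sub "$@";                   # so "om list" dispatches to om_list
            else
                om_help;
            fi ;;
    esac
}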
Importantly, om_iam replaces what I'd previously called main_function, now obsolescent. I've planted a forward reference where that was introduced, Saturday, Nov 7th, 2015.
1.4.4 Tuesday, 26th
Today, I solved the problem of linking to specific dates. For example, see yesterday.
1.5 May 2016
1.5.1 Wednesday, 11th
1.5.2 Saturday, 21st
Today, the Holy Grail of making with the shell. Two general-purpose functions, new_fromby and new_oldsuf, a class-defining function, org_one2ht, and an application, org_any2ht. First the general-purpose pair:
new_oldsuf () { 
    eval echo ${1%*.*}.{$2}
}
new_fromby () { 
    : ~ file newSuf,oldSuf Command, where;
    : Command file.OLDSUF ... is smart enough to write on file.NEWSUF,;
    : either explicitly, -o or STDOUT or implicitly, in wrapped command;
    report_notargcount 3 $# file new,old command && return 1;
    set $(new_oldsuf $1 $2) $3;
    report_notfile $2 && return 2;
    report_notfunction $3 && return 3;
    newest $1 $2 && return;
    trace_call $*;
    $3 $2
}
The _oldsuf function was the breakthrough. The two arguments are a file handle and a pair of suffixes a,b such that file.a is produced from file.b. This example would set up a C compile:
new_oldsuf main o,c
The function uses the bash shell expansion
echo main.{o,c}
to produce the pair of names for the newest dependency evaluation. It is remarkable in its ability to produce the necessary results since the following calls are all equivalent:
new_oldsuf main o,c
new_oldsuf main.c o,c
new_oldsuf main.o o,c
new_oldsuf main.err o,c
while this is not:
new_oldsuf main.* o,c
This might suggest the arguments be reversed, if only to lift the restriction.
The _fromby function spends most of its time checking assertions:
- three arguments,
- the second argument e.g. main.c must exist, the other need not.
- the third argument is a function.
The degenerate version of the function is simply:
set $(new_oldsuf $1 $2) $3; newest $1 $2 || $3 $2;
And the application org_any2ht wraps the class-defining function org_one2ht:
org_any2ht () { 
    trace_call $*;
    new_fromby $1 html,org org_one2ht
}
org_one2ht () { 
    report_notfile $1 && return 1;
    trace_call $*;
    $HOME/bEmacs $1 -f org-html-export-to-html --kill
}
Here, bEmacs is the handle to the emacs --batch process on my iMac:
exec /Applications/Emacs.app/Contents/MacOS/Emacs --batch $@
1.5.3 Monday, 23rd
Here are the updated functions. The user interface is org_any2ht.
Pay attention to the use of suf_newold. It's different from yesterday's design, and motivated by the call from org_any2ht, since _foreachij, like foreach, takes a command argument, followed by a pair of repeated arguments.
foreachij cmmd arg1 arg2 argA argB argC ...
# is equivalent to
cmmd arg1 arg2 argA
cmmd arg1 arg2 argB
cmmd arg1 arg2 argC
cmmd arg1 arg2 ...
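For reference, shell_foreachij might be no more than this sketch; this is an assumption about its shape, not the library code:

foreachij () { 
    local cmd=$1 argi=$2 argj=$3; shift 3;
    local a;
    for a in "$@"; do
        $cmd "$argi" "$argj" "$a";      # repeat the fixed cmd argi argj over each remaining argument
    done
}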
This allows iteration over the optional arguments, file.suf … A class of files is converted from an org suffix into an html file. So, now inspect the set $(suf_newold .. ) call in new_fromby. The third argument, repeated by the call from shell_foreachij, is each of the successive file handles.
A little obtuse, but greatly simplifies the user interface. It's especially productive in scanning a list of similar files, a file class.
suf_newold () { 
    : ~ suf,pair file{.suf};
    : NEED to validate ONE as a SUF,PAIR;
    eval echo ${2%.*}.{$1}
}
org_any2ht () { 
    trace_call $*;
    shell_foreachij new_fromby html,org org_oneToht $*
}
new_fromby () { 
    : ~ newSUF,OldSUF COMMAND FILE, where;
    : Command file.OLDSUF ... is smart enough to write on file.NEWSUF,;
    : either explicitly, -o or STDOUT or implicitly, in wrapped command;
    report_notargcount 3 $# new,old command FILE && return 1;
    : prior to SET: new,old COMMAND, file, becomes,
    set $(suf_newold $1 $3) $2;
    : file.new file.OLD COMMAND;
    report_notfile $2 && return 2;
    report_notfunction $3 && return 3;
    newest $1 $2 && return;
    trace_call $*;
    $3 $2
}
org_oneToht () { 
    report_notfile $1 && return 1;
    trace_call $*;
    doit $HOME/bEmacs $1 -f org-html-export-to-html --kill
}
1.6 June 2016
1.6.1 Tuesday, 7th
Over the weekend, my 50th class Reunion at MIT, I discovered a pair of functions to add to the flow-control section, formerly only consisting of the foreach collection: foreach, foreachi, and foreachij.
The new user function is if_elseif. Here it is with a test harness.
echo3 () { echo $1 $2 $3; }
echo5 () { echo $1 $2 $3 $4 $5; }
if_aorb () { 
    dotru=$1; shift;
    dofals=$1; shift;
    bool="$1"; shift;
    $(if_elseif "$bool" "$dotru" "$dofals") $*
}
if_elseif () { 
    : ~ condition {true result} {false result, or ~ ...};
    : returns first true result;
    case $# in 
        3) test $1 && echo $2 || echo $3 ;;
        *) test $1 && echo $2 || { shift 2; "$@"; } ;;
    esac
}
The function if_elseif accepts any number of else-if conditions. The minimum argument count is two.
The simple case, with three arguments, returns the second argument when the first is true, otherwise the third argument.
If there are more than three arguments, then the test and result on the first argument are the same, otherwise those two are shifted out and the process is repeated. Notice it's not necessary for the third argument to be another if_elseif in this case, but it makes for a natural semantic expression for a multi-way if … else-if … expression.
And the degenerate case, only two arguments (test #7) behaves as expected.
So, for the moment, I'll be putting this to work in new development. The behaviour of complex expressions is the thing to look out for.
A first insight: control flow, like function return statements, will require its own wrapper. For instance
something ... && { ...; return; }
is hard to manage in this model.
The test cases demonstrate the currently known possible applications.
if_testA () { 
    echo one two three;
    if_aorb echo3 echo5 '6 -gt 5' one two three four five six 7 eight;
    echo one two three four five;
    if_aorb echo3 echo5 '3 -gt 5' one two three four five six 7 eight;
    local fcond="3 -gt 5";
    local tcond="6 -gt 5";
    echo ==============;
    echo "True 1 $tcond";
    if_elseif "$tcond" "True 1 $tcond" \
        if_elseif "$fcond" "False 1 $fcond";
    echo ==============;
    echo "False 2 $fcond";
    if_elseif "$fcond" "True 2 $tcond" "False 2 $fcond";
    echo ==============;
    local answer="T 3 $tcond";
    local nonans="Fa 3 $fcond";
    local nonan2="Fb 3 $fcond";
    echo $answer;
    if_elseif "$tcond" "$answer" \
        if_elseif "$fcond" "$nonans" \
        if_elseif "$fcond" "$nonan2";
    echo ==============;
    local nonans="T 4 $tcond";
    local answer="fT 4 $fcond";
    local nonan2="Fb 4 $fcond";
    echo $answer;
    if_elseif "$fcond" "$nonans" \
        if_elseif "$tcond" "$answer" \
        if_elseif "$fcond" "$nonan2";
    echo ==============;
    local nonans="T 5 $tcond";
    local nonan2="fT 5 $fcond";
    local answer="FF 5 $fcond";
    echo $answer;
    if_elseif "$fcond" "$nonans" \
        if_elseif "$fcond" "$nonan2" "$answer";
    echo ==============;
    local answer="true 6 $fcond";
    echo $answer;
    if_elseif "$tcond" "$answer";
    echo ==============;
    local answer="false 7 $fcond";
    echo the ANSWER: $answer is EMPTY;
    if_elseif "$fcond" "$answer";
    echo ==============
}
And the result:
one two three
one two three
one two three four five
one two three four five
==============
True 1 6 -gt 5
True 1 6 -gt 5
==============
False 2 3 -gt 5
False 2 3 -gt 5
==============
T 3 6 -gt 5
T 3 6 -gt 5
==============
fT 4 3 -gt 5
fT 4 3 -gt 5
==============
FF 5 3 -gt 5
FF 5 3 -gt 5
==============
true 6 3 -gt 5
true 6 3 -gt 5
==============
the ANSWER: false 7 3 -gt 5 is EMPTY
==============
1.6.2 Friday, 10th
Discoveries are singular moments. Today, as a solution to the function-collection process for making an application collection, I discovered fun_level, which may be called recursively to produce the functions used by a function or functions. It's used:
fun_level fun_level ... echo FUNCTION ...
Here is its code, with its first few top function calls:
fun_level () { 
    : ~ GENERATOR { function ... };
    : called repeatedly returns functions at next level;
    foreach fun_uses $($*) | grep -v ^trace_ | sort -u
}
fun_uses () { 
    trace_call $*;
    foreach type_me $(foreach fun_extract $* | sort -u) | awk '$1 ~ /^function$/ {print $2}'
}
fun_extract () { 
    : ~ function ...;
    : returns list of functions called by FUNCTION;
    trace_call $*;
    declare -f $* | awk '$1 !~ /^:$/' | tr '(){};,/*$' ' ' | tpl | sort -u | grep '^[a-zA-Z0-9_]'
}
tpl () { 
    : ~ file ... or STDIN;
    : separates input into successive blank-separated Token Per Line;
    trace_call $*;
    cat ${*:--} | tr -s ' \t' '\n'
}
trace_simple () { 
    : ~ arg ...;
    : prints Calling Function and passed arguments to STDERR;
    pa=${FUNCNAME[2]:-COMMANDLINE};
    printf "TRACE %s\t%s\n" "$pa" "$*" 1>&2
}
trace_show () { 
    : show a basic, simple trace_call;
    : returns trace_simple as trace_call;
    declare -f trace_simple | sed 's/simple/call/'
}
Here is the usage to produce the applib-called functions, starting with app_{fun,init} and fun_level:
fun_level fun_level fun_level fun_level \
    fun_level echo app_{fun,init} fun_level | tee appfun.app | wc
And the list of functions recovered in appfun.app. The point is that the three functions app_fun, app_init, and fun_level call only functions in this list, with the sole exception of trace_call, whose suitable replacement is given by trace_show.
UC app_fun app_init app_trace app_uses cmd comment for_nomar foreach fun_extract fun_from fun_level fun_uses functions ignore isfunction myname needfrom om_generic om_iam printfirst qmoved report_notcalledby report_notcommand report_notfile report_notfunction report_usage sfg source tpl type_me wpl
Given the file appfun.app, the list of those functions, this command produces a complete application: all the functions needed to run any of the three function entry points.
$ (declare -f $(< appfun.app); trace_show; fun_starter app) | tee .l
In summary, today's function breakthrough (an FOTD) is fun_level, which may call itself recursively, albeit from the command line.
1.6.3 Saturday, 11th
To summarize yesterday, the command lines to capture all the functions for an application:
fun_level fun_level fun_level fun_level \
    fun_level echo app_{fun,init} fun_level | tee appfun.app | wc
$ (declare -f $(< appfun.app); trace_show; fun_starter app) | tee .l
For an untested demonstration, wrap that in a function:
fun_alllev () { 
    local r=${1%_*};                  # NOTE: guessed repair; the untested original read ${%_*}
    local a=${r}fun.txt;
    local l=${r}app;
    : ${FUN_LEVEL:=5};                # raise, lower, until successive
                                      # use returns same WC .......  vv
    $(repeated 'fun_level' ${FUN_LEVEL}) echo $* | tee $a | wc;   # closing paren restored; untested
    ( 
        declare -f $(< $a);           # all function bodies
        trace_show;                   # the trace replacement
        fun_starter $r                # the initialization call
    ) > $l
}
1.7 July 2016
Was a Minnesota Vacation.
1.8 August 2016
1.8.1 Wednesday, 3rd
Today I've come to realize the limited value of tangling function libraries. There is value in factoring the family groups, and in the notion of a utility library apart from a family group. But the value of quick and reliable update of single functions, at the "place of work", i.e. in the loadable, sourceable library, is too great to burden yourself with the bookkeeping task of updating a file, incurring a measurable turn-around time to put the code in production. Better to find a way to collect the changes and update the documentation. Not to mention leaving a trail of crumbs to the history of the changes.
To that end, here are a few functions from today's work to advance the reliable update and change-log. I'm learning that the tension is between the easy use of the command line, and working in a text edit buffer, such as emacs.
backup_log () { 
    pptlog $* >> ~/lib/backuplog.txt
}
fix_log () { 
    pptlog $* >> ~/lib/fixlog.txt
}
fun_to () { 
    f=$(which $1); shift;
    report_notfile $f && return 1;
    ff $* | tee -a $f;
    fix_log $f $*
}
vfirstline () { 
    awk "\$1 ~ /$1/ { print NR; exit }"
}
vix () { 
    case $# in 
        0) 
            comment $(myname) function [ .. ]
            ;;
        1) 
            set $1 $(whf $1) /dev/null;
            report_notfile $2 && return 1;
            case $EDITOR in 
                emacs) 
                    set $1 $2 $(cat $2 | vfirstline $1);
                    report_notargcount 3 $# Function File lineNumber;
                    ${EDITOR} -nw +$3 $2
                    ;;
                *) 
                    ${EDITOR:-vi} +/$1/ $2
                    ;;
            esac;
            . $2;
            fix_log $1 $2
            ;;
        *) 
            set $1 $(whf $1) /dev/null;
            vix $1;
            backup $2;
            backup_log $1 $2
            ;;
    esac
}
fun_files () { 
    : return list of files used in function;
    report_notfunction $1 && return 1;
    : final SORT to remove dups caused by $HOME a.o ~;
    trace_call $# $*;
    foreach namedfile $(fun_tokens $1 | grep '[a-zA-Z]') | sort -u
}
The user functions here, fun_to and vix are new. Actually, I've been using vix for a few months, only in the sense of its requiring a single argument.
The new features are:
- a helper; with no arguments, vix returns a hint,
- with more than one argument, the containing library, '$(whf $1)' is backed up, and
- in either case, each fix is logged, each backup is logged.
Note, the apparently redundant set idiom cannot be factored out of the case statement since it sets at least one argument, the non-file /dev/null. This gave me a problem when trying to emulate the +/pattern/ feature of the vi editor in Emacs. I still have questions about this feature, but some literature (Stack Overflow) suggests you need the -nw flag so emacs treats the editor as a terminal window. It's not clear. Here, the vfirstline function makes that a testable feature. And awk performs nearly as well as grep.
As a reminder-to-self, I've included fun_files, which for these functions:
$ quietly foreach fun_files backup_log fix_log fun_to reverseLines \
    yesterday fun_files fun_tokens
backup_log /Users/applemcg/lib/backuplog.txt
fix_log /Users/applemcg/lib/fixlog.txt
fun_to ../bin/cmdlib
reverseLines
yesterday
fun_files
fun_tokens
$
returns the names of files used by a function.
The reverseLines function works like this:
$ cat history | reverseLines | pickOutLines | reverseLines | tee Save
The job of pickOutLines is to keep just the lines you'd like to save from your history. Presumably you've taken a peek at history before starting the pipe. The idea behind reverseLines recognizes that we learn and build up our command line as we go: the last line of a similar sequence is the one to retain. Let's see what pickOutLines looks like.
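Meanwhile, reverseLines itself need be no more than this sketch, assuming its only job is to reverse line order, from a file or a pipe:

reverseLines () { 
    cat ${*:--} | awk '{ line[NR] = $0 } END { for (i = NR; i >= 1; i--) print line[i] }'
}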
1.8.2 Friday, 26th
Here's a new discovery: fun_discovery, a function to discover function references outside a library:
fun_discovery () { 
    report_notfile $1 LIBname && return 1
    set $1 $(basename $1)
    set $* ${2%lib}
    set $1 $2 ${3%.*}
    report_notargcount 3 $# $* -- Library .Name Root && return 1
    functions $1 | grep -v _doc | sort -u > $2;
    callgraph_eg $(cat $2) > /dev/null;
    command comm -13 $2 <(field 2 < ~/tmp/callgraph.out | sort -u) | grep -v ${3}_
}
This should greatly ease the pain of rounding up external function references from a file or library. The case in point, usbbklib, shows some careless function naming on my part. Look for a usbbk_doc function which records the BUGs elsewhere.
1.9 September 2016
1.9.1 Friday, 30th
Today, I discovered dff, put a Date on the Formatted Function:
dff () { 
    : date 2016-09-30;
    : for changes to functions, supply unique date;
    : as leading comment;
    : this uses the local function trick. trailing UNSET;
    function dffx () { 
        declare -f $1 | awk -v date=$(date +%F) '
            { print }
            NR == 2 { printf " : date %s;\n", date }
        ' | uniq
    };
    foreach dffx ${*:-dff};
    unset dffx
}
libsave () { 
    : date 2016-09-30;
    dff ${*:-libsave} | tee -a $(which cmdlib)
}
and libsave, using dff saves the dated function in cmdlib, now the home of ad-hoc, off the cuff functions.
Here are a few details to note:
- the dff function uses an idiom to create a local function: if the function dffx may only be called by its parent, then unset it when done. It (dffx) will be redefined each time the parent function runs; this is worth the trouble, since it's a small price to pay to avoid name-space pollution.
- The call to awk passes today's date in ISO format: YYYY-MM-DD. Also note that, when pasted into the function body, the trailing semi-colon is added. This is to respect the call to uniq, which eliminates multiple updates on the same day. It's also an encouragement to get it right before you put yourself to sleep.
It occurs to me now that with the dff feature, it will become possible to track creation dates of functions, and sort them in some fashion:
- creation date
- latest update
- number of updates
- functions updated within N days of one another
- …
Exciting times ahead.
This should probably go in the shd family of functions. And here's one more idea: shd_bydate, which reports functions and modification dates extracted from the dff or now shd_date.
1.10 October 2016
1.10.1 Saturday, 8th
Today's big discoveries were bigday and step_lap. The latter started out as laptimer, but since it belongs to the growing step family, that's where it lives now:
bigday () { 
    ${*:-echo} bigday {local,awk}_{data,lib} step_{lap,lapfile,report,reset}
}
step_lap () { 
    function clock_tick () { 
        set .tick;
        touch $1;
        eval $(stat -s $1);
        echo $st_mtime;
        rm -f $1
    };
    clock_tick >> $(step_lapfile);
    unset clock_tick
}
I have to round up functions in the bigday list; while they may have a proper home, a few were missing while using recorg, my ORGanizer. The "step_lap", or "laptimer" was the big breakthru.
I'm now producing YouTube Videos for Shell Functions. In order to properly rehearse the timing, I thought it useful to first have a timer to tell me the amount of time I'm using while practicing; I've promised to keep the videos to 6 minutes. And the tools I'm setting up in file:///Users/martymcgowan/talk/.h.count read a script file to prompt me for the examples.
It seems useful to record the elapsed time through each step in the script to see how I'm doing towards the goal. step_lap, using a local function clock_tick, records system time in seconds for each double-comma (,, or step_next) command. If you look at set_notstep and the alias in the companion ~/talk/.count.rc file, the set command has been co-opted to hide any functions in this file from the user display. The idea is to only show the developed functions, as if the user didn't have a copy of the step family.
By the way, the {local,awk}_{data,lib} functions were born in Family/invest. I had to run them down. It's like I need a companion to libsave when files are housed in a particular application. That idea needs work.
1.10.2 Sunday, 16th
The last week has been a productive one here. At the top of the heap: shd_{s,g}etdate, and _dffx has moved in as a wholly owned subfunction of _setdate. First the functions, then a report.
shd_setdate () { 
    : for changes to functions, supply unique date;
    : as last line among leading comments;
    : this uses the local function trick. trailing UNSET;
    : date 2016-09-30;
    : date 2016-10-01;
    function _dffx () { 
        declare -f $1 | awk -v date=$(date +%F) '
            BEGIN { code = 0 }
            NR < 3 || ( \
                NR > 2 && !code && $1 ~ /^:$/ \
            )       { print; next }
            !code   { printf " : date %s;\n", date
                      code = 1 }
                    { print }
        ' | uniq
    };
    foreach _dffx ${*:-shd_setdate};
    unset _dffx
}
shd_getdate () { 
    : date 2016-10-01;
    trace_call $*;
    : reconcile patterns to fuse;
    case $# in 
        0) report_notpipe && return 1 ;;
    esac;
    cat ${*:--} | awk '
        $2 ~ /^\(\)$/                 { fun = $1 }
        /^function /                  { fun = $2 }
        $1 ~ /^:$/ && $2 ~ /^date$/   { printf "%s\t%s\n", fun, $0 }
    '
}
cmdlib () { 
    : date 2016-09-30;
    which cmdlib
}
zero_arg_help () { 
    set -- $* $(myname 2);
    case $1 in 
        [1-9]*) return 1 ;;
    esac;
    declare -f ${2:-$(myname)}
}
A big recent addition has been zero_arg_help, which arose from a thought in the YouTube series on shell functions I've recently started. The thought: shell functions which take a function argument should supply a convenient default, maybe themselves. But if they don't have a convenient default, what to do in the case of no arguments? The simple answer: display their own function body. Here's a user:
$ libsave
libsave () { 
    : date 2016-09-30;
    : date 2016-10-01;
    zero_arg_help $# && return 1;
    shd_setdate $* | tee -a $(cmdlib)
}
$
And now the shd_{getdate,latest} reports; these functions are early in their life:
$ ff shd_latest; shd_latest cmdlib | sort -k2
shd_latest () { 
    shd_getdate $* | awk '{ date[$1] = $4 }; END { for (d in date) print date[d], d; }' | sed 's/;//'
}
2016-10-16 awk_file
2016-10-16 awk_lib
2016-10-16 awk_name
2016-10-12 backhere
2016-10-09 backup_files
2016-10-09 backup_version
2016-09-30 cmdlib
2016-09-30 funcabsl
2016-10-06 home
2016-10-01 libsave
2016-10-16 local_data
2016-10-16 local_lib
2016-10-06 master_fields
2016-10-02 ncal2org
2016-10-06 nonmlibs
2016-10-02 nvn
2016-10-06 rdb_timetally
2016-10-02 recorg
2016-10-06 sawk
2016-10-01 shd_getdate
2016-10-01 shd_setdate
2016-10-05 shellfunctions
2016-10-06 sql_denominator
2016-10-05 talk_list
2016-10-05 theFunction
2016-10-02 timestamp
2016-09-30 whf
$
This has been the holy grail for me: explicitly recording the dates of function modification. The final challenges:
- put a function in the library detailing its contents, and their modification dates.
- another function to return modified functions to their default directories
- and do this in lieu of a single convenient default
1.10.3 Sunday, 23rd
This was the week of the stat library. It keeps a four-field table in ~/lib, named stat.rdb. The fields: mtime, size, name, directory. My motivating feature is to see when files change, and record the changes. The rdb command rdput records insert and delete times in the h….Z history file for the companion table. So, the rdb command:
zcat h.stat.rdb.Z | row 'name ~ /thisFile/'
shows the history of "thisFile".
The stat_dirs function produces this report:
directory
---------
~/Dropbox
~/Dropbox/bin
~/Dropbox/commonplace
~/Dropbox/commonplace/astro
~/Dropbox/commonplace/letters
~/Dropbox/commonplace/rdb
~/Dropbox/commonplace/software
~/Dropbox/commonplace/software/script
~/Dropbox/commonplace/story
~/Dropbox/commonplace/sudoku
~/Dropbox/commonplace/unixshell
~/Dropbox/commonplace/version
~/bin
~/git/shelf/inc
~/lib
~/talk
~/talk/.bin
~/talk/.doc
Running stat_rdb in any directory updates its table entries. So, with a list of these directories, one can imagine a command to routinely update the table.
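That routine update might be sketched like this; it builds on stat_dirs and stat_rdb above, and the two skipped header lines and the leading-~ substitution are guesses about the report format:

stat_update () { 
    : SKETCH ONLY;
    local d;
    stat_dirs | sed 1,2d | while read d; do
        ( cd $(echo $d | sed "s|^~|$HOME|") && stat_rdb )    # expand the leading ~, update that directory
    done
}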
Here is a typical report:
$ stat_data | column name | rd ../bin/sort | rduniq | row 'Count > 2 && name ~ /org$/'
Count name
----- ----
4     index.org
4     orgHeader.org
$
Using those names, we may see:
$ stat_data | row 'name ~ /index.org/ || name ~ /orgHeader.org/' | jm
mtime      size  name          directory
-----      ----  ----          ---------
1476808028 4125  index.org     ~/Dropbox/commonplace/astro
1476807626 69    index.org     ~/talk/.doc
1475254385 155   orgHeader.org ~/Dropbox/commonplace/software/script
1475254385 155   orgHeader.org ~/Dropbox/commonplace/software
1475254385 155   orgHeader.org ~/Dropbox
1473090273 6490  index.org     ~/Dropbox/commonplace
1472995298 155   orgHeader.org ~/Dropbox/commonplace
1442697075 1058  index.org     ~/Dropbox/commonplace/unixshell
$
1.10.4 Thursday, 27th
I've also solved the functions problem, but seem to have misplaced the good result. It now works with the three inputs below (a sketch of the dispatch follows the list):
- a named file
- standard input, or
- lacking either of those, the internal functions
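A sketch of that three-way dispatch, an assumption about the shape rather than the recovered original:

functions () { 
    local src;
    if   [ -f "${1:-}" ];   then src=$(cat $1);        # a named file
    elif [ -p /dev/stdin ]; then src=$(cat -);         # standard input
    else                         src=$(declare -f);    # the shell's own functions
    fi;
    printf '%s\n' "$src" | awk '$2 == "()" { print $1 }' | sort -u
}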
1.11 November 2016
1.11.1 Thursday, 10th
As the basis of capturing some command history, yesterday, in the aftermath of a horrid election result, I came to "capture_some", here with an instance of its use:
capture_some () { 
    local fun=capture_$1;
    echo "$fun () { ";
    quietly ngh $* | quietly nhn | uniq | sed 's/^/ /';
    echo "}"
}
capture_fmcsv () { 
    ls fixlib
    functions fixlib
    cat fixlib
    backup fixlib
    sfg fmcsv
    whf fmcsv_init
    . fmcsvlib
    fmcsv_hdrTbl export.csv | tee members.nxt
    fmcsv_hdrTbl csv/export.csv | tee members.nxt
    sfg fmcsv_
    sfg fmcsv_hdrTbl
    ff fmcsv_hdrTbl
    fmcsv_hdrTbl csv/export.csv | tee members.rdb | jm
    fmcsv_hdrTbl csv/export.csv | tee members.rdb | wc
    fmcsv_hdrTbl csv/export.csv | tee members.rdb | jm
    libsave ../fixlib sinceDec2015 memberID duplicateID memberStatus
    libsave ../fixlib memberReport
    cat ../fixlib | shd_getdate
    functions fixlib
    quietlyforeach do_whf $(functions fixlib)
    quietly foreach do_whf $(functions fixlib)
    quietly foreach do_whf $(functions fixlib) | grep fmcsvlib
    set $(quietly foreach do_whf $(functions fixlib) | grep fmcsvlib|field 1)
    libsave fmcsvlib $*
    set $(quietly foreach do_whf $(functions fixlib) | grep -v fmcsvlib|field 1)
    quietly foreach do_whf $(functions fixlib) | tee .fixlib.whf
    grep -v fcsvlib .fixlib.whf
    grep -v fmcsvlib .fixlib.whf
    ff $(grep -v fmcsvlib .fixlib.whf) | tee fixlib
    grep -v fmcsvlib .fixlib.whf
    grep -v fmcsvlib .fixlib.whf| grep -f fmcsv_
    grep -v fmcsvlib .fixlib.whf| grep -v fmcsv_
    ff $(grep -v fmcsvlib .fixlib.whf| grep -v fmcsv_) | tee fixlib
    captures_some capture_these fmcsv fixlib
    capture_some capture_these fmcsv fixlib
    capture_some fmcsv fixlib
    capture_some fmcsv fixlib | tee -a ~/tmp/.x
}
Notice the last few lines, a usage guide. What remains is to edit the file, in this case ~/tmp/.x, to trim the redundancies and rename the function.
1.11.2 Sunday, 13th
I've a few post-election discoveries to share:
1.12 December 2016
1.12.1 Saturday, 3rd
Here are four important functions from the last few days: lib_crunch, isfunction, spaceit and fuse.
spaceit () { 
    : anagram of letters in set, pipe, cat;
    : date 2016-10-27;
    : date 2016-11-03;
    : date 2016-11-12;
    : defaults,;
    : NO arguments, if PIPE, then CAT, else SET;
    : and arguments, the FILES;
    : date 2016-12-03;
    function ispipe () { [[ -p /dev/stdin ]]; };
    case $# in 
        0) ispipe && cat || set ;;
        *) cat $* ;;
    esac
}
fuse () { 
    : date 2016-11-10;
    : date 2016-12-03;
    : add the equals sign, double quotes to lead;
    : date 2016-12-03;
    report_notargcount 1 $# pattern && return 1;
    : write the awk script;
    function _find_pattern () { 
        awk -v p="$1" '
            BEGIN {
                lead = "[^/.a-zA-Z0-9=\"_-]"
                pata = lead p lead
                patb = lead p "$"
                pcts = "%s"
                print "$1 ~ /^:$/ || $1 ~ /^comment$/ { next }"
                print "$2 ~ /^\\(\\)$/ { f = $1 }"
                printf "/%s|%s/ { printf \"%s\\t%s\\n\", f, $0 }\n", pata, patb, pcts, pcts
                exit
            }
        '
    };
    p="$1"; shift;
    spaceit $* | awk -f <(_find_pattern $p);
    unset _find_pattern
}
isfunction () { 
    : ~ function;
    : returns TRUTH of argument as FUNCTION;
    : date 2016-12-03;
    case $1 in 
        [a-zA-Z_][a-zA-Z0-9_]*) declare -f $1 > /dev/null ;;
        *) return 1 ;;
    esac
}
lib_crunch () { 
    : date 2016-12-03;
    trace_call $*;
    report_notfile ${1:-MissingArgument} && return 1;
    backup_lib $1;
    . $1;
    ( fbdy $(functions $1); fun_starter $1 ) > .l;
    mv .l $1;
    backup_lib $1
}
The spaceit function adds some interesting and, on reflection, ever more important features to a function-oriented set of tasks: the default input stream is the current environment, the result of using the set command.
For example, it renders the sfuse function obsolete, since now fuse, all by itself, safely assumes the "set" input.
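A usage note on that default: with no file arguments and no pipe on stdin, spaceit falls back to set, so fuse searches the whole current environment; fed a stream, it searches that instead.

fuse report_notfile                  # no file, no pipe: searches the current environment
declare -f | fuse report_notfile     # the same search over an explicit stream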
1.12.2 Wednesday, 7th
A couple of "cuties" are new today, gx and nar, for GREP XARGS and NOT ARCHIVE nor BACKUP.
gx () { 
    : date 2016-12-07;
    xargs grep -l "$@"
}
nar () { 
    : date 2016-12-07;
    egrep -v '/(ar|backup)/' $*
}
A simple use case:
$ htmlfiles | nar | gx img
lists HTML files with an img tag.
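htmlfiles isn't shown here; any stand-in that emits HTML file paths, one per line, would do, for instance:

htmlfiles () { find . -name '*.html' -o -name '*.htm'; }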
1.12.3 Sunday, 18th
Let's do some design right here. The problem I've noted – and not shared here – is the question: How to have a local, internal library within a library? Let's explain. In the shell, when a library has been sourced, all the functions defined in the library according to the function syntax:
functionname () { ... ; }
function functionname { ... ; }
function functionname () { ... ; }
expose functionname globally.
Occasionally, a function may want entirely local sub-functions only visible within itself. Here's my template for such behaviour:
recorg () { 
    function forg () { find ${*-.} -name '*.org'; };
    function recorgx () { 
        forg ../talk;
        forg $(ls | grep -v ' ') ../{doc,stonebridge,git} | nvn
    };
    function recorgy () { ... };
    function recorgt () { ... };
    ...
    recorgx | ...
    recorgt $1 | recorgy > $3;
    rdput $3;
    unset recorg{t,x,y} forg;
    ...
}
The four internal functions are defined only when recorg is executed, and, late in its execution, if not in the last statement, the local functions are unset, or undefined. Thus the recorg name is visible, but none of the four subfunctions are.
The problem now arises when two or more globally named functions want access to similar functions or facilities. Before getting carried away, the caveat arises around the definition of "or more", since in a library family with a few dozen functions, presumably more than one is globally useful. The solution I'm about to propose makes sense if there are, say, two or three global functions looking for some common code. But if the number of client functions grows much larger, say to the range of seven, sort of a magic number, then these internal library functions should simply be globally available themselves.
At the risk of getting carried away, this is likely to be every bit as important as my family concept.
Here's the outline of an internal library, its definition and its use.
function_library () { 
    : this is a global function, but only these functions may call it;
    report_notcalledby function_{a,b,..}
    : following functions are purely local to this library;
    function lib_a () { : first local lib ...; }
    function lib_b () { : more local lib ...; }
    ...
    : the lone executable defaults to a comment. Needs testing;
    ${*:-:} lib_{a,b,...}
}
function_a () { 
    function_a1 () { : single usage as before; }
    ...
    : loads the library;
    function_library
    ...
    lib_a ...
    lib_b ...
    : and when done with the function library;
    function_library unset
}
The assertion worked, and in a simulated test:
$ a () { ${*:-:} x y z; }
$ a
$ a echo
x y z
$ set -x; a
+ a
+ : x y z

fun_aa () { 
    fun_xx;
    ff xx_a;
    xx_a;
    fun_xx unset;
}
fun_xx () { 
    function xx_a { myname 2; };
    ${*:-:} xx_a;
}

fun_aa
set +x
fun_aa
gh
ff xx_a
fun_aa
Where ff, gh, and myname are my own utilities.
That's enough for tonight.