From 1079f927a27db9d194c8e25eb3a188396fdf8eab Mon Sep 17 00:00:00 2001
From: Paul Buetow
Date: Mon, 17 May 2021 21:02:55 +0100
Subject: refactor code

---
 TODO.md                                            |   6 -
 buetow.org.sh                                      |  14 +-
 content/gemtext/gemfeed/atom.xml                   |   8 +-
 ...alistic-load-testing-with-ioriot-for-linux.html |   4 +-
 .../2021-04-24-welcome-to-the-geminispace.html     |   2 +-
 content/html/gemfeed/atom.xml                      |   8 +-
 ...realistic-load-testing-with-ioriot-for-linux.md |   4 +-
 .../2021-04-24-welcome-to-the-geminispace.md       |   2 +-
 modules/assert.source.sh                           |  30 -----
 modules/atomfeed.source.sh                         | 118 -----------------
 modules/gemfeed.source.sh                          |  51 -------
 modules/generate.source.sh                         | 128 ------------------
 modules/html.source.sh                             | 146 ---------------------
 modules/log.source.sh                              |  26 ----
 modules/md.source.sh                               |  55 --------
 packages/assert.source.sh                          |  30 +++++
 packages/atomfeed.source.sh                        | 118 +++++++++++++++++
 packages/gemfeed.source.sh                         |  51 +++++++
 packages/generate.source.sh                        | 128 ++++++++++++++++++
 packages/html.source.sh                            | 145 ++++++++++++++++++++
 packages/log.source.sh                             |  26 ++++
 packages/md.source.sh                              |  55 ++++++++
 22 files changed, 574 insertions(+), 581 deletions(-)
 delete mode 100644 modules/assert.source.sh
 delete mode 100644 modules/atomfeed.source.sh
 delete mode 100644 modules/gemfeed.source.sh
 delete mode 100644 modules/generate.source.sh
 delete mode 100644 modules/html.source.sh
 delete mode 100644 modules/log.source.sh
 delete mode 100644 modules/md.source.sh
 create mode 100644 packages/assert.source.sh
 create mode 100644 packages/atomfeed.source.sh
 create mode 100644 packages/gemfeed.source.sh
 create mode 100644 packages/generate.source.sh
 create mode 100644 packages/html.source.sh
 create mode 100644 packages/log.source.sh
 create mode 100644 packages/md.source.sh

diff --git a/TODO.md b/TODO.md
index 3562ffe1..b75b26a8 100644
--- a/TODO.md
+++ b/TODO.md
@@ -4,10 +4,4 @@ Adjust code to reflect the google style guide.
 Use this to practice navigating t
 * comment complex functions
 * comment all lib functions
-* TODOs always with a name in it, e.g. "TODO(paul): BLabla"
-* fix signle vs double quotes: strings without var interpolation
-* [[ ]] is preferred over [ ]
-* bash -c 'help readarray' trick, zsh alias/function for bash help
-* avoid a stand alone (( i++ ))
-* rename ./modules to ./packages
 * buetow.org.conf: declare -xr FOO=bar both constant and env.
diff --git a/buetow.org.sh b/buetow.org.sh
index 21e19352..fb5f45f8 100755
--- a/buetow.org.sh
+++ b/buetow.org.sh
@@ -12,13 +12,13 @@ readonly DATE
 readonly SED

 source buetow.org.conf
-source ./modules/assert.source.sh
-source ./modules/atomfeed.source.sh
-source ./modules/gemfeed.source.sh
-source ./modules/generate.source.sh
-source ./modules/html.source.sh
-source ./modules/log.source.sh
-source ./modules/md.source.sh
+source ./packages/assert.source.sh
+source ./packages/atomfeed.source.sh
+source ./packages/gemfeed.source.sh
+source ./packages/generate.source.sh
+source ./packages/html.source.sh
+source ./packages/log.source.sh
+source ./packages/md.source.sh

 help () {
     cat <
- 2021-05-16T18:34:25+01:00
+ 2021-05-17T21:01:05+01:00
 buetow.org feed
 Having fun with computers!
@@ -354,7 +354,7 @@ fi
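The remaining TODO item about `declare -xr` can be sketched as follows. The flag combination makes a variable both read-only (a constant) and exported (an environment variable visible to child processes); `FOO=bar` is just the placeholder name from the TODO item, not a real config value:

```shell
#!/usr/bin/env bash
# declare -x exports the variable, -r makes it read-only; combined,
# FOO behaves both as a constant and as an environment variable.
declare -xr FOO=bar

echo "$FOO"                          # set in this shell
bash -c 'echo "${FOO:-unset}"'       # exported, so a child process sees it too

# Attempting to reassign a read-only variable fails:
if ! (FOO=baz) 2>/dev/null; then
    echo 'FOO is read-only'
fi
```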

My urge to revamp my personal website

For some time I had the urge to revamp my personal website. Not to update its technology and design, but to update all the content (and keep it current) and to start a small tech blog again. So, unconsciously, I started searching for a good platform and/or software to do all of that in a KISS (keep it simple & stupid) way.

My still great Laptop running hot

-

Earlier this year (2021) I noticed that my 6 year old but still great Laptop started to become hot and slowed down while surfing the web. Also, the Laptop's fan became quite noisy. This is all due to the additional bloat such as JavaScript, excessive use of CSS, tracking cookies+pixels, ads and so on there was on the website.

+

Earlier this year (2021) I noticed that my almost 7-year-old but still great laptop started to run hot and slowed down while surfing the web. Also, the laptop's fan became quite noisy. This is all due to the additional bloat on the website: JavaScript, excessive use of CSS, tracking cookies and pixels, ads, and so on.

All I wanted was to read an interesting article, but after a big advertising pop-up banner appeared and made everything worse, I gave up and closed the browser tab.

Discovering the Gemini internet protocol

Around the same time I discovered a relatively new, more lightweight protocol named Gemini, which does not support CPU-intensive features such as HTML, JavaScript and CSS. Tracking and ads are not supported by the Gemini protocol either.

@@ -489,7 +489,7 @@ jgs\__/'---'\__/

Foreword

This text was first published in the German IT-Administrator computer magazine. Three years have passed since then, and I decided to publish it on my blog too.

https://www.admin-magazin.de/Das-Heft/2018/06/Realistische-Lasttests-mit-I-O-Riot
-

I havn't worked on I/O Riot for some time now, but all what is written here is still valid. I am still using I/O Riot to debug I/O issues and pattern once in a while, so by all means the tool is not obsolete yet. The tool even helped to resolve a major production incident at work involving I/O.

+

I haven't worked on I/O Riot for some time now, but everything written here is still valid. I still use I/O Riot to debug I/O issues and patterns once in a while, so the tool is by no means obsolete yet. It even helped to resolve a major production incident at work caused by disk I/O.

I am eagerly looking forward to revamping I/O Riot so that it uses the new BPF Linux capabilities instead of plain old SystemTap (or alternatively, as I have learned, newer versions of SystemTap can also use BPF as the backend). Also, when I initially wrote I/O Riot, I didn't have any experience with the Go programming language yet, so I wrote it in C. Once it gets revamped I might consider using Go instead of C, as it would spare me many segmentation faults and headaches during development ;-). I might also just stick with C for plain performance reasons and refactor only the code dealing with concurrency.

Please note that some of the screenshots show the command "ioreplay" instead of "ioriot". That's because the name changed after those were taken.

The article

@@ -501,7 +501,7 @@ jgs\__/'---'\__/

Testing in the production environment: For these reasons, benchmarks are often carried out in the production environment. To derive value from this, such tests are performed especially during peak hours, when systems are under high load. However, testing on production systems carries risks and, without adequate safeguards, can lead to failures or data loss.

Benchmarking the Email Cloud at Mimecast

For email archiving, Mimecast uses an internally developed microservice, which runs directly on Linux-based storage systems. A storage cluster is divided into several replication volumes. Data is always replicated three times across two secure data centers. Customer data is automatically allocated to one or more volumes, depending on throughput, so that all volumes automatically receive the same load. Customer data is archived on conventional but inexpensive hard disks with several terabytes of storage capacity each. I/O benchmarking proved difficult for all the reasons mentioned above. Furthermore, for self-developed software there are no ready-made tools for this purpose. The service operates on many block devices simultaneously, which can make the RAID controller a bottleneck. None of the freely available benchmarking tools can test several block devices at the same time without extra effort. Moreover, emails typically consist of many small files, and randomized access to many small files is particularly inefficient. In addition to many software adaptations, the hardware and operating system must also be optimized.

-

Mimecast encourages employees to be innovative and pursue their own ideas in the form of an internal competition, Pet Project. The goal of the pet project I/O Riot was to simplify OS and hardware level I/O benchmarking. The first prototype of I/O Riot was awarded an internal roadmap prize in the spring of 2017. A few months later, I/O Riot was used to reduce write latency in the storage clusters by about 50%. The improvement was first verified by I/O replay on a test system and then successively applied to all storage systems. I/O Riot was also used to resolve a production incident related to disk I/O load.

+

Mimecast encourages employees to be innovative and pursue their own ideas in the form of an internal competition, Pet Project. The goal of the pet project I/O Riot was to simplify OS and hardware level I/O benchmarking. The first prototype of I/O Riot was awarded an internal roadmap prize in the spring of 2017. A few months later, I/O Riot was used to reduce write latency in the storage clusters by about 50%. The improvement was first verified by I/O replay on a test system and then successively applied to all storage systems. I/O Riot was also used to resolve a production incident caused by disk I/O load.

Using I/O Riot

First, all I/O events on a production system are logged to a file with I/O Riot. The file is then copied to a test system, where all events are replayed in the same way. The crucial point is that you can reproduce the I/O patterns found on a production system as often as you like on a test system. This makes it possible to tune the system's knobs after each run.
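The record/copy/replay workflow just described can be illustrated with a toy stand-in. The real I/O Riot captures kernel-level I/O events via SystemTap, which this sketch does not attempt; here the "events" are simply replayable shell commands, and all paths are temporary stand-ins:

```shell
#!/usr/bin/env bash
# Toy illustration of the record/copy/replay idea, NOT I/O Riot itself.
set -euo pipefail

prod=$(mktemp -d)      # stands in for the production system
test_sys=$(mktemp -d)  # stands in for the test system

# 1. "Record": log each I/O event as a replayable command.
cat > "$prod/io.log" <<'EOF'
echo hello > a.txt
mkdir -p subdir
echo world > subdir/b.txt
EOF

# 2. Copy the capture to the test system (scp in real life).
cp "$prod/io.log" "$test_sys/io.log"

# 3. Replay the same I/O pattern on the test system; after tuning the
#    OS or hardware, the identical pattern can be replayed again.
(cd "$test_sys" && bash io.log)

echo 'replay done'
```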

Installation

diff --git a/content/html/gemfeed/2018-06-01-realistic-load-testing-with-ioriot-for-linux.html b/content/html/gemfeed/2018-06-01-realistic-load-testing-with-ioriot-for-linux.html
index 53ef5543..c201997f 100644
--- a/content/html/gemfeed/2018-06-01-realistic-load-testing-with-ioriot-for-linux.html
+++ b/content/html/gemfeed/2018-06-01-realistic-load-testing-with-ioriot-for-linux.html
@@ -67,7 +67,7 @@ jgs\__/'---'\__/

Foreword

This text was first published in the German IT-Administrator computer magazine. Three years have passed since then, and I decided to publish it on my blog too.

https://www.admin-magazin.de/Das-Heft/2018/06/Realistische-Lasttests-mit-I-O-Riot
-

I havn't worked on I/O Riot for some time now, but all what is written here is still valid. I am still using I/O Riot to debug I/O issues and pattern once in a while, so by all means the tool is not obsolete yet. The tool even helped to resolve a major production incident at work involving I/O.

+

I haven't worked on I/O Riot for some time now, but everything written here is still valid. I still use I/O Riot to debug I/O issues and patterns once in a while, so the tool is by no means obsolete yet. It even helped to resolve a major production incident at work caused by disk I/O.

I am eagerly looking forward to revamping I/O Riot so that it uses the new BPF Linux capabilities instead of plain old SystemTap (or alternatively, as I have learned, newer versions of SystemTap can also use BPF as the backend). Also, when I initially wrote I/O Riot, I didn't have any experience with the Go programming language yet, so I wrote it in C. Once it gets revamped I might consider using Go instead of C, as it would spare me many segmentation faults and headaches during development ;-). I might also just stick with C for plain performance reasons and refactor only the code dealing with concurrency.

Please note that some of the screenshots show the command "ioreplay" instead of "ioriot". That's because the name changed after those were taken.

The article

@@ -79,7 +79,7 @@ jgs\__/'---'\__/

Testing in the production environment: For these reasons, benchmarks are often carried out in the production environment. To derive value from this, such tests are performed especially during peak hours, when systems are under high load. However, testing on production systems carries risks and, without adequate safeguards, can lead to failures or data loss.

Benchmarking the Email Cloud at Mimecast

For email archiving, Mimecast uses an internally developed microservice, which runs directly on Linux-based storage systems. A storage cluster is divided into several replication volumes. Data is always replicated three times across two secure data centers. Customer data is automatically allocated to one or more volumes, depending on throughput, so that all volumes automatically receive the same load. Customer data is archived on conventional but inexpensive hard disks with several terabytes of storage capacity each. I/O benchmarking proved difficult for all the reasons mentioned above. Furthermore, for self-developed software there are no ready-made tools for this purpose. The service operates on many block devices simultaneously, which can make the RAID controller a bottleneck. None of the freely available benchmarking tools can test several block devices at the same time without extra effort. Moreover, emails typically consist of many small files, and randomized access to many small files is particularly inefficient. In addition to many software adaptations, the hardware and operating system must also be optimized.

-

Mimecast encourages employees to be innovative and pursue their own ideas in the form of an internal competition, Pet Project. The goal of the pet project I/O Riot was to simplify OS and hardware level I/O benchmarking. The first prototype of I/O Riot was awarded an internal roadmap prize in the spring of 2017. A few months later, I/O Riot was used to reduce write latency in the storage clusters by about 50%. The improvement was first verified by I/O replay on a test system and then successively applied to all storage systems. I/O Riot was also used to resolve a production incident related to disk I/O load.

+

Mimecast encourages employees to be innovative and pursue their own ideas in the form of an internal competition, Pet Project. The goal of the pet project I/O Riot was to simplify OS and hardware level I/O benchmarking. The first prototype of I/O Riot was awarded an internal roadmap prize in the spring of 2017. A few months later, I/O Riot was used to reduce write latency in the storage clusters by about 50%. The improvement was first verified by I/O replay on a test system and then successively applied to all storage systems. I/O Riot was also used to resolve a production incident caused by disk I/O load.

Using I/O Riot

First, all I/O events on a production system are logged to a file with I/O Riot. The file is then copied to a test system, where all events are replayed in the same way. The crucial point is that you can reproduce the I/O patterns found on a production system as often as you like on a test system. This makes it possible to tune the system's knobs after each run.

Installation

diff --git a/content/html/gemfeed/2021-04-24-welcome-to-the-geminispace.html b/content/html/gemfeed/2021-04-24-welcome-to-the-geminispace.html
index 60058422..2488ccbe 100644
--- a/content/html/gemfeed/2021-04-24-welcome-to-the-geminispace.html
+++ b/content/html/gemfeed/2021-04-24-welcome-to-the-geminispace.html
@@ -80,7 +80,7 @@ h2, h3 {

My urge to revamp my personal website

For some time I had the urge to revamp my personal website. Not to update its technology and design, but to update all the content (and keep it current) and to start a small tech blog again. So, unconsciously, I started searching for a good platform and/or software to do all of that in a KISS (keep it simple & stupid) way.

My still great Laptop running hot

-

Earlier this year (2021) I noticed that my 6 year old but still great Laptop started to become hot and slowed down while surfing the web. Also, the Laptop's fan became quite noisy. This is all due to the additional bloat such as JavaScript, excessive use of CSS, tracking cookies+pixels, ads and so on there was on the website.

+

Earlier this year (2021) I noticed that my almost 7-year-old but still great laptop started to run hot and slowed down while surfing the web. Also, the laptop's fan became quite noisy. This is all due to the additional bloat on the website: JavaScript, excessive use of CSS, tracking cookies and pixels, ads, and so on.

All I wanted was to read an interesting article, but after a big advertising pop-up banner appeared and made everything worse, I gave up and closed the browser tab.

Discovering the Gemini internet protocol

Around the same time I discovered a relatively new, more lightweight protocol named Gemini, which does not support CPU-intensive features such as HTML, JavaScript and CSS. Tracking and ads are not supported by the Gemini protocol either.

diff --git a/content/html/gemfeed/atom.xml b/content/html/gemfeed/atom.xml
index d626a57f..b87afe87 100644
--- a/content/html/gemfeed/atom.xml
+++ b/content/html/gemfeed/atom.xml
@@ -1,6 +1,6 @@
- 2021-05-16T18:34:25+01:00
+ 2021-05-17T21:01:05+01:00
 buetow.org feed
 Having fun with computers!
@@ -354,7 +354,7 @@ fi

My urge to revamp my personal website

For some time I had the urge to revamp my personal website. Not to update its technology and design, but to update all the content (and keep it current) and to start a small tech blog again. So, unconsciously, I started searching for a good platform and/or software to do all of that in a KISS (keep it simple & stupid) way.

My still great Laptop running hot

-

Earlier this year (2021) I noticed that my 6 year old but still great Laptop started to become hot and slowed down while surfing the web. Also, the Laptop's fan became quite noisy. This is all due to the additional bloat such as JavaScript, excessive use of CSS, tracking cookies+pixels, ads and so on there was on the website.

+

Earlier this year (2021) I noticed that my almost 7-year-old but still great laptop started to run hot and slowed down while surfing the web. Also, the laptop's fan became quite noisy. This is all due to the additional bloat on the website: JavaScript, excessive use of CSS, tracking cookies and pixels, ads, and so on.

All I wanted was to read an interesting article, but after a big advertising pop-up banner appeared and made everything worse, I gave up and closed the browser tab.

Discovering the Gemini internet protocol

Around the same time I discovered a relatively new, more lightweight protocol named Gemini, which does not support CPU-intensive features such as HTML, JavaScript and CSS. Tracking and ads are not supported by the Gemini protocol either.

@@ -489,7 +489,7 @@ jgs\__/'---'\__/

Foreword

This text was first published in the German IT-Administrator computer magazine. Three years have passed since then, and I decided to publish it on my blog too.

https://www.admin-magazin.de/Das-Heft/2018/06/Realistische-Lasttests-mit-I-O-Riot
-

I havn't worked on I/O Riot for some time now, but all what is written here is still valid. I am still using I/O Riot to debug I/O issues and pattern once in a while, so by all means the tool is not obsolete yet. The tool even helped to resolve a major production incident at work involving I/O.

+

I haven't worked on I/O Riot for some time now, but everything written here is still valid. I still use I/O Riot to debug I/O issues and patterns once in a while, so the tool is by no means obsolete yet. It even helped to resolve a major production incident at work caused by disk I/O.

I am eagerly looking forward to revamping I/O Riot so that it uses the new BPF Linux capabilities instead of plain old SystemTap (or alternatively, as I have learned, newer versions of SystemTap can also use BPF as the backend). Also, when I initially wrote I/O Riot, I didn't have any experience with the Go programming language yet, so I wrote it in C. Once it gets revamped I might consider using Go instead of C, as it would spare me many segmentation faults and headaches during development ;-). I might also just stick with C for plain performance reasons and refactor only the code dealing with concurrency.

Please note that some of the screenshots show the command "ioreplay" instead of "ioriot". That's because the name changed after those were taken.

The article

@@ -501,7 +501,7 @@ jgs\__/'---'\__/

Testing in the production environment: For these reasons, benchmarks are often carried out in the production environment. To derive value from this, such tests are performed especially during peak hours, when systems are under high load. However, testing on production systems carries risks and, without adequate safeguards, can lead to failures or data loss.

Benchmarking the Email Cloud at Mimecast

For email archiving, Mimecast uses an internally developed microservice, which runs directly on Linux-based storage systems. A storage cluster is divided into several replication volumes. Data is always replicated three times across two secure data centers. Customer data is automatically allocated to one or more volumes, depending on throughput, so that all volumes automatically receive the same load. Customer data is archived on conventional but inexpensive hard disks with several terabytes of storage capacity each. I/O benchmarking proved difficult for all the reasons mentioned above. Furthermore, for self-developed software there are no ready-made tools for this purpose. The service operates on many block devices simultaneously, which can make the RAID controller a bottleneck. None of the freely available benchmarking tools can test several block devices at the same time without extra effort. Moreover, emails typically consist of many small files, and randomized access to many small files is particularly inefficient. In addition to many software adaptations, the hardware and operating system must also be optimized.

-

Mimecast encourages employees to be innovative and pursue their own ideas in the form of an internal competition, Pet Project. The goal of the pet project I/O Riot was to simplify OS and hardware level I/O benchmarking. The first prototype of I/O Riot was awarded an internal roadmap prize in the spring of 2017. A few months later, I/O Riot was used to reduce write latency in the storage clusters by about 50%. The improvement was first verified by I/O replay on a test system and then successively applied to all storage systems. I/O Riot was also used to resolve a production incident related to disk I/O load.

+

Mimecast encourages employees to be innovative and pursue their own ideas in the form of an internal competition, Pet Project. The goal of the pet project I/O Riot was to simplify OS and hardware level I/O benchmarking. The first prototype of I/O Riot was awarded an internal roadmap prize in the spring of 2017. A few months later, I/O Riot was used to reduce write latency in the storage clusters by about 50%. The improvement was first verified by I/O replay on a test system and then successively applied to all storage systems. I/O Riot was also used to resolve a production incident caused by disk I/O load.

Using I/O Riot

First, all I/O events on a production system are logged to a file with I/O Riot. The file is then copied to a test system, where all events are replayed in the same way. The crucial point is that you can reproduce the I/O patterns found on a production system as often as you like on a test system. This makes it possible to tune the system's knobs after each run.

Installation

diff --git a/content/md/gemfeed/2018-06-01-realistic-load-testing-with-ioriot-for-linux.md b/content/md/gemfeed/2018-06-01-realistic-load-testing-with-ioriot-for-linux.md
index 260a0368..77ee72ee 100644
--- a/content/md/gemfeed/2018-06-01-realistic-load-testing-with-ioriot-for-linux.md
+++ b/content/md/gemfeed/2018-06-01-realistic-load-testing-with-ioriot-for-linux.md
@@ -19,7 +19,7 @@ This text first was published in the german IT-Administrator computer Magazine.

 [https://www.admin-magazin.de/Das-Heft/2018/06/Realistische-Lasttests-mit-I-O-Riot](https://www.admin-magazin.de/Das-Heft/2018/06/Realistische-Lasttests-mit-I-O-Riot)

-I havn't worked on I/O Riot for some time now, but all what is written here is still valid. I am still using I/O Riot to debug I/O issues and pattern once in a while, so by all means the tool is not obsolete yet. The tool even helped to resolve a major production incident at work involving I/O.
+I haven't worked on I/O Riot for some time now, but everything written here is still valid. I still use I/O Riot to debug I/O issues and patterns once in a while, so the tool is by no means obsolete yet. It even helped to resolve a major production incident at work caused by disk I/O.

 I am eagerly looking forward to revamping I/O Riot so that it uses the new BPF Linux capabilities instead of plain old SystemTap (or alternatively, as I have learned, newer versions of SystemTap can also use BPF as the backend). Also, when I initially wrote I/O Riot, I didn't have any experience with the Go programming language yet, so I wrote it in C. Once it gets revamped I might consider using Go instead of C, as it would spare me many segmentation faults and headaches during development ;-). I might also just stick with C for plain performance reasons and refactor only the code dealing with concurrency.
@@ -43,7 +43,7 @@ Testing in the production environment: For these reasons, benchmarks are often c

 For email archiving, Mimecast uses an internally developed microservice, which runs directly on Linux-based storage systems. A storage cluster is divided into several replication volumes. Data is always replicated three times across two secure data centers. Customer data is automatically allocated to one or more volumes, depending on throughput, so that all volumes automatically receive the same load. Customer data is archived on conventional but inexpensive hard disks with several terabytes of storage capacity each. I/O benchmarking proved difficult for all the reasons mentioned above. Furthermore, for self-developed software there are no ready-made tools for this purpose. The service operates on many block devices simultaneously, which can make the RAID controller a bottleneck. None of the freely available benchmarking tools can test several block devices at the same time without extra effort. Moreover, emails typically consist of many small files, and randomized access to many small files is particularly inefficient. In addition to many software adaptations, the hardware and operating system must also be optimized.

-Mimecast encourages employees to be innovative and pursue their own ideas in the form of an internal competition, Pet Project. The goal of the pet project I/O Riot was to simplify OS and hardware level I/O benchmarking. The first prototype of I/O Riot was awarded an internal roadmap prize in the spring of 2017. A few months later, I/O Riot was used to reduce write latency in the storage clusters by about 50%. The improvement was first verified by I/O replay on a test system and then successively applied to all storage systems. I/O Riot was also used to resolve a production incident related to disk I/O load.
+Mimecast encourages employees to be innovative and pursue their own ideas in the form of an internal competition, Pet Project. The goal of the pet project I/O Riot was to simplify OS and hardware level I/O benchmarking. The first prototype of I/O Riot was awarded an internal roadmap prize in the spring of 2017. A few months later, I/O Riot was used to reduce write latency in the storage clusters by about 50%. The improvement was first verified by I/O replay on a test system and then successively applied to all storage systems. I/O Riot was also used to resolve a production incident caused by disk I/O load.

 ## Using I/O Riot
diff --git a/content/md/gemfeed/2021-04-24-welcome-to-the-geminispace.md b/content/md/gemfeed/2021-04-24-welcome-to-the-geminispace.md
index fcc77ab5..8b6e229f 100644
--- a/content/md/gemfeed/2021-04-24-welcome-to-the-geminispace.md
+++ b/content/md/gemfeed/2021-04-24-welcome-to-the-geminispace.md
@@ -36,7 +36,7 @@ For some time I had to urge to revamp my personal website. Not to update the tec

 ### My still great Laptop running hot

-Earlier this year (2021) I noticed that my 6 year old but still great Laptop started to become hot and slowed down while surfing the web. Also, the Laptop's fan became quite noisy. This is all due to the additional bloat such as JavaScript, excessive use of CSS, tracking cookies+pixels, ads and so on there was on the website.
+Earlier this year (2021) I noticed that my almost 7-year-old but still great laptop started to run hot and slowed down while surfing the web. Also, the laptop's fan became quite noisy. This is all due to the additional bloat on the website: JavaScript, excessive use of CSS, tracking cookies and pixels, ads, and so on.

 All I wanted was to read an interesting article, but after a big advertising pop-up banner appeared and made everything worse, I gave up and closed the browser tab.
diff --git a/modules/assert.source.sh b/modules/assert.source.sh deleted file mode 100644 index d7c507a4..00000000 --- a/modules/assert.source.sh +++ /dev/null @@ -1,30 +0,0 @@ -assert::equals () { - local -r result="$1"; shift - local -r expected="$1"; shift - local -r callee=${FUNCNAME[1]} - - if [ "$result" != "$expected" ]; then - cat < "$atom_file.tmp" - - - $now - $DOMAIN feed - $SUBTITLE - - - gemini://$DOMAIN/ -ATOMHEADER - - while read -r gmi_file; do - # Load cached meta information about the post. - source <(atomfeed::meta "$gemfeed_dir/$gmi_file") - # Get HTML content for the feed - local content="$(atomfeed::content "$gemfeed_dir/$gmi_file")" - - assert::not_empty meta_title "$meta_title" - assert::not_empty meta_date "$meta_date" - assert::not_empty meta_author "$meta_author" - assert::not_empty meta_email "$meta_email" - assert::not_empty meta_summary "$meta_summary" - assert::not_empty content "$content" - - cat <> "$atom_file.tmp" - - $meta_title - - gemini://$DOMAIN/gemfeed/$gmi_file - $meta_date - - $meta_author - $meta_email - - $meta_summary - -
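The assert helper deleted/recreated above follows a small reusable pattern: compare a result with an expectation and report the calling function via `FUNCNAME`. A minimal runnable sketch is below; the original's heredoc error message is truncated in this patch, so the message format and the `return 1` failure behavior here are assumptions:

```shell
#!/usr/bin/env bash
# Sketch of the assert::equals pattern from packages/assert.source.sh.
# Error-message wording and failure behavior are assumptions; the
# original heredoc is truncated above.
assert::equals () {
    local -r result="$1"; shift
    local -r expected="$1"; shift
    local -r callee=${FUNCNAME[1]}   # the function that called the assert

    if [[ "$result" != "$expected" ]]; then
        echo "Assertion failed in $callee: got '$result', expected '$expected'" >&2
        return 1
    fi
}

greet () { echo hello; }
check_greet () { assert::equals "$(greet)" hello; }
check_greet && echo 'assertion passed'
```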
- $content -
-
-
-ATOMENTRY - done < <(gemfeed::get_posts | head -n $ATOM_MAX_ENTRIES) - - cat <> "$atom_file.tmp" -
-ATOMFOOTER - - # Delete the 3rd line of the atom feeds (global feed update timestamp) - if ! diff -u <($SED 3d "$atom_file") <($SED 3d "$atom_file.tmp"); then - log INFO 'Feed got something new!' - mv "$atom_file.tmp" "$atom_file" - test "$ADD_GIT" == yes && git add "$atom_file" - else - log INFO 'Nothing really new in the feed' - rm "$atom_file.tmp" - fi -} diff --git a/modules/gemfeed.source.sh b/modules/gemfeed.source.sh deleted file mode 100644 index c68c5070..00000000 --- a/modules/gemfeed.source.sh +++ /dev/null @@ -1,51 +0,0 @@ -# Filters out blog posts from other files in the gemfeed dir. -gemfeed::get_posts () { - local -r gemfeed_dir="$CONTENT_DIR/gemtext/gemfeed" - local -r gmi_pattern='^[0-9]{4}-[0-9]{2}-[0-9]{2}-.*\.gmi$' - local -r draft_pattern='\.draft\.gmi$' - - ls "$gemfeed_dir" | grep -E "$gmi_pattern" | grep -E -v "$draft_pattern" | sort -r -} - -# Adds the links from gemfeed/index.gmi to the main index site. -gemfeed::updatemainindex () { - local -r index_gmi="$CONTENT_DIR/gemtext/index.gmi" - local -r gemfeed_dir="$CONTENT_DIR/gemtext/gemfeed" - - log VERBOSE "Updating $index_gmi with posts from $gemfeed_dir" - - # Remove old gemfeeds from main index - $SED '/^=> .\/gemfeed\/[0-9].* - .*/d;' "$index_gmi" > "$index_gmi.tmp" - # Add current gemfeeds to main index - $SED -n '/^=> / { s| ./| ./gemfeed/|; p; }' "$gemfeed_dir/index.gmi" >> "$index_gmi.tmp" - - mv "$index_gmi.tmp" "$index_gmi" - test "$ADD_GIT" == yes && git add "$index_gmi" -} - -# This generates a index.gmi in the ./gemfeed subdir. -gemfeed::generate () { - local -r gemfeed_dir="$CONTENT_DIR/gemtext/gemfeed" - log INFO "Generating Gemfeed index for $gemfeed_dir" - -cat < "$gemfeed_dir/index.gmi.tmp" -# $DOMAIN's Gemfeed - -## $SUBTITLE - -GEMFEED - - gemfeed::get_posts | while read -r gmi_file; do - # Extract first heading as post title. - local title=$($SED -n '/^# / { s/# //; p; q; }' "$gemfeed_dir/$gmi_file" | tr '"' "'") - # Extract the date from the file name. 
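The feed-update check above compares the old and new atom.xml with line 3 (the feed-wide update timestamp) deleted, so a feed that differs only in its timestamp is not treated as changed. A standalone sketch with stand-in files (the XML here is dummy content, not the real feed):

```shell
#!/usr/bin/env bash
# Standalone sketch of the "ignore the timestamp when diffing" check:
# line 3 of each file holds the global update stamp, so strip it with
# 'sed 3d' before comparing.
set -euo pipefail

old=$(mktemp)
new=$(mktemp)
printf 'feed\ntitle\n2021-05-16T18:34:25+01:00\nentry a\n' > "$old"
printf 'feed\ntitle\n2021-05-17T21:01:05+01:00\nentry a\n' > "$new"

# Only line 3 differs, so this counts as "nothing new".
if diff -u <(sed 3d "$old") <(sed 3d "$new") >/dev/null; then
    verdict='Nothing really new in the feed'
else
    verdict='Feed got something new!'
fi
echo "$verdict"
rm -f "$old" "$new"
```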
-        local filename_date=$(basename "$gemfeed_dir/$gmi_file" | cut -d- -f1,2,3)
-
-        echo "=> ./$gmi_file $filename_date - $title" >> "$gemfeed_dir/index.gmi.tmp"
-    done
-
-    mv "$gemfeed_dir/index.gmi.tmp" "$gemfeed_dir/index.gmi"
-    test "$ADD_GIT" == yes && git add "$gemfeed_dir/index.gmi"
-
-    gemfeed::updatemainindex
-}
diff --git a/modules/generate.source.sh b/modules/generate.source.sh
deleted file mode 100644
index 171c31cf..00000000
--- a/modules/generate.source.sh
+++ /dev/null
@@ -1,128 +0,0 @@
-generate::make_link () {
-    local -r what="$1"; shift
-    local -r line="${1/=> }"; shift
-    local link
-    local descr
-
-    while read -r token; do
-        if [ -z "$link" ]; then
-            link="$token"
-        elif [ -z "$descr" ]; then
-            descr="$token"
-        else
-            descr="$descr $token"
-        fi
-    done < <(echo "$line" | tr ' ' '\n')
-
-    if grep -E -q "$IMAGE_PATTERN" <<< "$link"; then
-        if [ "$what" == md ]; then
-            md::make_img "$link" "$descr"
-        else
-            html::make_img "$link" "$(html::special "$descr")"
-        fi
-        return
-    fi
-
-    if [ "$what" == md ]; then
-        md::make_link "$link" "$descr"
-    else
-        html::make_link "$link" "$(html::special "$descr")"
-    fi
-}
-
-generate::fromgmi_ () {
-    local -r src="$1"; shift
-    local -r format="$1"; shift
-    local dest=${src/gemtext/$format}
-    dest=${dest/.gmi/.$format}
-    local dest_dir=$(dirname "$dest")
-
-    test ! -d "$dest_dir" && mkdir -p "$dest_dir"
-    if [ "$format" == html ]; then
-        cat header.html.part > "$dest.tmp"
-        html::fromgmi < "$src" >> "$dest.tmp"
-        cat footer.html.part >> "$dest.tmp"
-    elif [ "$format" == md ]; then
-        md::fromgmi < "$src" >> "$dest.tmp"
-    fi
-
-    mv "$dest.tmp" "$dest"
-    test "$ADD_GIT" == yes && git add "$dest"
-}
-
-generate::fromgmi_add_docs () {
-    local -r src="$1"; shift
-    local -r format="$1"; shift
-    local -r dest=${src/gemtext/$format}
-    local -r dest_dir=$(dirname "$dest")
-
-    test ! -d "$dest_dir" && mkdir -p "$dest_dir"
-    cp "$src" "$dest"
-    test "$ADD_GIT" == yes && git add "$dest"
-}
-
-generate::convert_gmi_atom_to_html_atom () {
-    local -r format="$1"; shift
-    test "$format" != html && return
-
-    log INFO 'Converting Gemtext Atom feed to HTML Atom feed'
-
-    $SED 's|.gmi|.html|g; s|gemini://|https://|g' \
-        < $CONTENT_DIR/gemtext/gemfeed/atom.xml \
-        > $CONTENT_DIR/html/gemfeed/atom.xml
-
-    test "$ADD_GIT" == yes && git add $CONTENT_DIR/html/gemfeed/atom.xml
-}
-
-generate::fromgmi_cleanup () {
-    local -r src="$1"; shift
-    local -r format="$1"; shift
-    local dest=${src/.$format/.gmi}
-    dest=${dest/$format/gemtext}
-
-    test ! -f "$dest" && test "$ADD_GIT" == yes && git rm "$src"
-}
-
-generate::fromgmi () {
-    local -i num_gmi_files=0
-    local -i num_doc_files=0
-
-    log INFO "Generating $* from Gemtext"
-
-    while read -r src; do
-        (( num_gmi_files++ ))
-        for format in "$@"; do
-            generate::fromgmi_ "$src" "$format"
-        done
-    done < <(find "$CONTENT_DIR/gemtext" -type f -name \*.gmi)
-
-    log INFO "Converted $num_gmi_files Gemtext files"
-
-    # Add non-.gmi files to html dir.
-    log VERBOSE "Adding other docs to $*"
-
-    while read -r src; do
-        (( num_doc_files++ ))
-        for format in "$@"; do
-            generate::fromgmi_add_docs "$src" "$format"
-        done
-    done < <(find "$CONTENT_DIR/gemtext" -type f | grep -E -v '(.gmi|atom.xml|.tmp)$')
-
-    log INFO "Added $num_doc_files other documents to each of $*"
-
-    # Add atom feed for HTML
-    for format in "$@"; do
-        generate::convert_gmi_atom_to_html_atom "$format"
-    done
-
-    # Remove obsolete files from ./html/
-    for format in "$@"; do
-        find "$CONTENT_DIR/$format" -type f | while read -r src; do
-            generate::fromgmi_cleanup "$src" "$format"
-        done
-    done
-
-    for format in "$@"; do
-        log INFO "$format can be found in $CONTENT_DIR/$format now"
-    done
-}
diff --git a/modules/html.source.sh b/modules/html.source.sh
deleted file mode 100644
index 3eb2ee4e..00000000
--- a/modules/html.source.sh
+++ /dev/null
@@ -1,146 +0,0 @@
-html::special () {
-    $SED '
-        s|\&|\&amp;|g;
-        s|<|\&lt;|g;
-        s|>|\&gt;|g;
-    ' <<< "$@"
-}
-
-html::make_paragraph () {
-    local -r text="$1"; shift
-    test -n "$text" && echo "<p>$(html::special "$text")</p>"
-}
-
-html::make_heading () {
-    local -r text=$($SED -E 's/^#+ //' <<< "$1"); shift
-    local -r level="$1"; shift
-
-    echo "<h$level>$(html::special "$text")</h$level>"
-}
-
-html::make_quote () {
-    local -r quote="${1/> }"
-    echo "<blockquote>$(html::special "$quote")</blockquote>"
-}
-
-html::make_img () {
-    local link="$1"; shift
-    local descr="$1"; shift
-
-    if [ -z "$descr" ]; then
-        echo -n "<a href=\"$link\"><img src=\"$link\" /></a>"
-    else
-        echo -n "<a href=\"$link\">$descr:<br />"
-        echo -n "<img alt=\"$descr\" title=\"$descr\" src=\"$link\" /></a>"
-    fi
-
-    echo "<br />"
-}
-
-html::make_link () {
-    local link="$1"; shift
-    local descr="$1"; shift
-
-    grep -F -q '://' <<< "$link" || link=${link/.gmi/.html}
-    test -z "$descr" && descr="$link"
-    echo "<a href=\"$link\">$descr</a><br />"
-}
-
-html::fromgmi () {
-    local -i is_list=0
-    local -i is_plain=0
-
-    while IFS='' read -r line; do
-        if [ $is_list -eq 1 ]; then
-            if [[ "$line" == '* '* ]]; then
-                echo "  <li>$(html::special "${line/\* /}")</li>"
-            else
-                is_list=0
-                echo "</ul>"
-            fi
-            continue
-
-        elif [ $is_plain -eq 1 ]; then
-            if [[ "$line" == '```'* ]]; then
-                echo "</pre>"
-                is_plain=0
-            else
-                html::special "$line"
-            fi
-            continue
-        fi
-
-        case "$line" in
-            '* '*)
-                is_list=1
-                echo "<ul>"
-                echo "  <li>${line/\* /}</li>"
-                ;;
-            '```'*)
-                is_plain=1
-                echo "<pre>"
      -                ;;
      -            '# '*)
      -                html::make_heading "$line" 1
      -                ;;
      -            '## '*)
      -                html::make_heading "$line" 2
      -                ;;
      -            '### '*)
      -                html::make_heading "$line" 3
      -                ;;
      -            '> '*)
      -                html::make_quote "$line"
      -                ;;
      -            '=> '*)
      -                generate::make_link html "$line"
      -                ;;
      -            *)
      -                html::make_paragraph "$line"
      -                ;;
      -        esac
      -    done
      -}
      -
      -html::test () {
      -    local line='Hello world! This is a paragraph.'
-    assert::equals "$(html::make_paragraph "$line")" '<p>Hello world! This is a paragraph.</p>'
-
-    line=''
-    assert::equals "$(html::make_paragraph "$line")" ''
-
-    line='Foo &<>& Bar!'
-    assert::equals "$(html::make_paragraph "$line")" '<p>Foo &amp;&lt;&gt;&amp; Bar!</p>'
-
-    line='# Header 1'
-    assert::equals "$(html::make_heading "$line" 1)" '<h1>Header 1</h1>'
-
-    line='## Header 2'
-    assert::equals "$(html::make_heading "$line" 2)" '<h2>Header 2</h2>'
-
-    line='### Header 3'
-    assert::equals "$(html::make_heading "$line" 3)" '<h3>Header 3</h3>'
-
-    line='> This is a quote'
-    assert::equals "$(html::make_quote "$line")" '<blockquote>This is a quote</blockquote>'
-
-    line='=> https://example.org'
-    assert::equals "$(generate::make_link html "$line")" \
-        '<a href="https://example.org">https://example.org</a><br />'
-
-    line='=> index.gmi'
-    assert::equals "$(generate::make_link html "$line")" \
-        '<a href="index.html">index.html</a><br />'
-
-    line='=> http://example.org Description of the link'
-    assert::equals "$(generate::make_link html "$line")" \
-        '<a href="http://example.org">Description of the link</a><br />'
-
-    line='=> http://example.org/image.png'
-    assert::equals "$(generate::make_link html "$line")" \
-        '<a href="http://example.org/image.png"><img src="http://example.org/image.png" /></a><br />'
-
-    line='=> http://example.org/image.png Image description'
-    assert::equals "$(generate::make_link html "$line")" \
-        '<a href="http://example.org/image.png">Image description:<br /><img alt="Image description" title="Image description" src="http://example.org/image.png" /></a><br />'
-}
diff --git a/modules/log.source.sh b/modules/log.source.sh
deleted file mode 100644
index 55d693ec..00000000
--- a/modules/log.source.sh
+++ /dev/null
@@ -1,26 +0,0 @@
-log () {
-    local -r level="$1"; shift
-
-    for message in "$@"; do
-        echo "$message"
-    done | log::_pipe "$level"
-}
-
-log::pipe () {
-    log::_pipe "$1"
-}
-
-log::_pipe () {
-    local -r level="$1"; shift
-
-    if [[ "$level" == VERBOSE && -z "$LOG_VERBOSE" ]]; then
-        return
-    fi
-
-    local -r callee=${FUNCNAME[2]}
-    local -r stamp=$($DATE +%Y%m%d-%H%M%S)
-
-    while read -r line; do
-        echo "$level|$stamp|$callee|$line" >&2
-    done
-}
diff --git a/modules/md.source.sh b/modules/md.source.sh
deleted file mode 100644
index 197bdcf7..00000000
--- a/modules/md.source.sh
+++ /dev/null
@@ -1,55 +0,0 @@
-md::make_img () {
-    local link="$1"; shift
-    local descr="$1"; shift
-
-    if [ -z "$descr" ]; then
-        echo "[![$link]($link)]($link) "
-    else
-        echo "[![$descr]($link \"$descr\")]($link) "
-    fi
-}
-
-md::make_link () {
-    local link="$1"; shift
-    local descr="$1"; shift
-
-    grep -F -q '://' <<< "$link" || link=${link/.gmi/.md}
-    test -z "$descr" && descr="$link"
-
-    echo "[$descr]($link) "
-}
-
-md::test () {
-    local line='=> https://example.org'
-    assert::equals "$(generate::make_link md "$line")" \
-        '[https://example.org](https://example.org) '
-
-    line='=> index.md'
-    assert::equals "$(generate::make_link md "$line")" \
-        '[index.md](index.md) '
-
-    line='=> http://example.org Description of the link'
-    assert::equals "$(generate::make_link md "$line")" \
-        '[Description of the link](http://example.org) '
-
-    line='=> http://example.org/image.png'
-    assert::equals "$(generate::make_link md "$line")" \
-        '[![http://example.org/image.png](http://example.org/image.png)](http://example.org/image.png) '
-
-    line='=> http://example.org/image.png Image description'
-    assert::equals "$(generate::make_link md "$line")" \
-        '[![Image description](http://example.org/image.png "Image description")](http://example.org/image.png) '
-}
-
-md::fromgmi () {
-    while IFS='' read -r line; do
-        case "$line" in
-            '=> '*)
-                generate::make_link md "$line"
-                ;;
-            *)
-                echo "$line"
-                ;;
-        esac
-    done
-}
diff --git a/packages/assert.source.sh b/packages/assert.source.sh
new file mode 100644
index 00000000..551d1623
--- /dev/null
+++ b/packages/assert.source.sh
@@ -0,0 +1,30 @@
+assert::equals () {
+    local -r result="$1"; shift
+    local -r expected="$1"; shift
+    local -r callee=${FUNCNAME[1]}
+
+    if [[ "$result" != "$expected" ]]; then
+        cat <<ERROR
diff --git a/packages/atomfeed.source.sh b/packages/atomfeed.source.sh
new file mode 100644
--- /dev/null
+++ b/packages/atomfeed.source.sh
@@ -0,0 +1,118 @@
+    cat <<ATOMHEADER > "$atom_file.tmp"
+<?xml version="1.0" encoding="utf-8"?>
+<feed xmlns="http://www.w3.org/2005/Atom">
+    <updated>$now</updated>
+    <title>$DOMAIN feed</title>
+    <subtitle>$SUBTITLE</subtitle>
+    <link href="gemini://$DOMAIN/gemfeed/atom.xml" rel="self" />
+    <link href="gemini://$DOMAIN/gemfeed/" />
+    <id>gemini://$DOMAIN/</id>
+ATOMHEADER
+
+    while read -r gmi_file; do
+        # Load cached meta information about the post.
+        source <(atomfeed::meta "$gemfeed_dir/$gmi_file")
+        # Get HTML content for the feed
+        local content="$(atomfeed::content "$gemfeed_dir/$gmi_file")"
+
+        assert::not_empty meta_title "$meta_title"
+        assert::not_empty meta_date "$meta_date"
+        assert::not_empty meta_author "$meta_author"
+        assert::not_empty meta_email "$meta_email"
+        assert::not_empty meta_summary "$meta_summary"
+        assert::not_empty content "$content"
+
+        cat <<ATOMENTRY >> "$atom_file.tmp"
+    <entry>
+        <title>$meta_title</title>
+        <link href="gemini://$DOMAIN/gemfeed/$gmi_file" />
+        <id>gemini://$DOMAIN/gemfeed/$gmi_file</id>
+        <updated>$meta_date</updated>
+        <author>
+            <name>$meta_author</name>
+            <email>$meta_email</email>
+        </author>
+        <summary>$meta_summary</summary>
+        <content type="xhtml">
+            <div xmlns="http://www.w3.org/1999/xhtml">
+                $content
+            </div>
+        </content>
+    </entry>
+ATOMENTRY
+    done < <(gemfeed::get_posts | head -n $ATOM_MAX_ENTRIES)
+
+    cat <<ATOMFOOTER >> "$atom_file.tmp"
+</feed>
+ATOMFOOTER
+
+    # Delete the 3rd line of the atom feeds (global feed update timestamp)
+    if ! diff -u <($SED 3d "$atom_file") <($SED 3d "$atom_file.tmp"); then
+        log INFO 'Feed got something new!'
+        mv "$atom_file.tmp" "$atom_file"
+        test "$ADD_GIT" == yes && git add "$atom_file"
+    else
+        log INFO 'Nothing really new in the feed'
+        rm "$atom_file.tmp"
+    fi
+}
diff --git a/packages/gemfeed.source.sh b/packages/gemfeed.source.sh
new file mode 100644
index 00000000..c68c5070
--- /dev/null
+++ b/packages/gemfeed.source.sh
@@ -0,0 +1,51 @@
+# Filters out blog posts from other files in the gemfeed dir.
+gemfeed::get_posts () {
+    local -r gemfeed_dir="$CONTENT_DIR/gemtext/gemfeed"
+    local -r gmi_pattern='^[0-9]{4}-[0-9]{2}-[0-9]{2}-.*\.gmi$'
+    local -r draft_pattern='\.draft\.gmi$'
+
+    ls "$gemfeed_dir" | grep -E "$gmi_pattern" | grep -E -v "$draft_pattern" | sort -r
+}
+
+# Adds the links from gemfeed/index.gmi to the main index site.
+gemfeed::updatemainindex () {
+    local -r index_gmi="$CONTENT_DIR/gemtext/index.gmi"
+    local -r gemfeed_dir="$CONTENT_DIR/gemtext/gemfeed"
+
+    log VERBOSE "Updating $index_gmi with posts from $gemfeed_dir"
+
+    # Remove old gemfeeds from main index
+    $SED '/^=> .\/gemfeed\/[0-9].* - .*/d;' "$index_gmi" > "$index_gmi.tmp"
+    # Add current gemfeeds to main index
+    $SED -n '/^=> / { s| ./| ./gemfeed/|; p; }' "$gemfeed_dir/index.gmi" >> "$index_gmi.tmp"
+
+    mv "$index_gmi.tmp" "$index_gmi"
+    test "$ADD_GIT" == yes && git add "$index_gmi"
+}
+
+# This generates a index.gmi in the ./gemfeed subdir.
+gemfeed::generate () {
+    local -r gemfeed_dir="$CONTENT_DIR/gemtext/gemfeed"
+    log INFO "Generating Gemfeed index for $gemfeed_dir"
+
+cat <<GEMFEED > "$gemfeed_dir/index.gmi.tmp"
+# $DOMAIN's Gemfeed
+
+## $SUBTITLE
+
+GEMFEED
+
+    gemfeed::get_posts | while read -r gmi_file; do
+        # Extract first heading as post title.
+        local title=$($SED -n '/^# / { s/# //; p; q; }' "$gemfeed_dir/$gmi_file" | tr '"' "'")
+        # Extract the date from the file name.
+        local filename_date=$(basename "$gemfeed_dir/$gmi_file" | cut -d- -f1,2,3)
+
+        echo "=> ./$gmi_file $filename_date - $title" >> "$gemfeed_dir/index.gmi.tmp"
+    done
+
+    mv "$gemfeed_dir/index.gmi.tmp" "$gemfeed_dir/index.gmi"
+    test "$ADD_GIT" == yes && git add "$gemfeed_dir/index.gmi"
+
+    gemfeed::updatemainindex
+}
diff --git a/packages/generate.source.sh b/packages/generate.source.sh
new file mode 100644
index 00000000..0f07af56
--- /dev/null
+++ b/packages/generate.source.sh
@@ -0,0 +1,128 @@
+generate::make_link () {
+    local -r what="$1"; shift
+    local -r line="${1/=> }"; shift
+    local link
+    local descr
+
+    while read -r token; do
+        if [ -z "$link" ]; then
+            link="$token"
+        elif [ -z "$descr" ]; then
+            descr="$token"
+        else
+            descr="$descr $token"
+        fi
+    done < <(echo "$line" | tr ' ' '\n')
+
+    if grep -E -q "$IMAGE_PATTERN" <<< "$link"; then
+        if [[ "$what" == md ]]; then
+            md::make_img "$link" "$descr"
+        else
+            html::make_img "$link" "$(html::special "$descr")"
+        fi
+        return
+    fi
+
+    if [[ "$what" == md ]]; then
+        md::make_link "$link" "$descr"
+    else
+        html::make_link "$link" "$(html::special "$descr")"
+    fi
+}
+
+generate::fromgmi_ () {
+    local -r src="$1"; shift
+    local -r format="$1"; shift
+    local dest=${src/gemtext/$format}
+    dest=${dest/.gmi/.$format}
+    local dest_dir=$(dirname "$dest")
+
+    test ! -d "$dest_dir" && mkdir -p "$dest_dir"
+    if [[ "$format" == html ]]; then
+        cat header.html.part > "$dest.tmp"
+        html::fromgmi < "$src" >> "$dest.tmp"
+        cat footer.html.part >> "$dest.tmp"
+    elif [[ "$format" == md ]]; then
+        md::fromgmi < "$src" >> "$dest.tmp"
+    fi
+
+    mv "$dest.tmp" "$dest"
+    test "$ADD_GIT" == yes && git add "$dest"
+}
+
+generate::fromgmi_add_docs () {
+    local -r src="$1"; shift
+    local -r format="$1"; shift
+    local -r dest=${src/gemtext/$format}
+    local -r dest_dir=$(dirname "$dest")
+
+    test ! -d "$dest_dir" && mkdir -p "$dest_dir"
+    cp "$src" "$dest"
+    test "$ADD_GIT" == yes && git add "$dest"
+}
+
+generate::convert_gmi_atom_to_html_atom () {
+    local -r format="$1"; shift
+    test "$format" != html && return
+
+    log INFO 'Converting Gemtext Atom feed to HTML Atom feed'
+
+    $SED 's|.gmi|.html|g; s|gemini://|https://|g' \
+        < $CONTENT_DIR/gemtext/gemfeed/atom.xml \
+        > $CONTENT_DIR/html/gemfeed/atom.xml
+
+    test "$ADD_GIT" == yes && git add "$CONTENT_DIR/html/gemfeed/atom.xml"
+}
+
+generate::fromgmi_cleanup () {
+    local -r src="$1"; shift
+    local -r format="$1"; shift
+    local dest=${src/.$format/.gmi}
+    dest=${dest/$format/gemtext}
+
+    test ! -f "$dest" && test "$ADD_GIT" == yes && git rm "$src"
+}
+
+generate::fromgmi () {
+    local -i num_gmi_files=0
+    local -i num_doc_files=0
+
+    log INFO "Generating $* from Gemtext"
+
+    while read -r src; do
+        (( num_gmi_files++ ))
+        for format in "$@"; do
+            generate::fromgmi_ "$src" "$format"
+        done
+    done < <(find "$CONTENT_DIR/gemtext" -type f -name \*.gmi)
+
+    log INFO "Converted $num_gmi_files Gemtext files"
+
+    # Add non-.gmi files to html dir.
+    log VERBOSE "Adding other docs to $*"
+
+    while read -r src; do
+        (( num_doc_files++ ))
+        for format in "$@"; do
+            generate::fromgmi_add_docs "$src" "$format"
+        done
+    done < <(find "$CONTENT_DIR/gemtext" -type f | grep -E -v '(.gmi|atom.xml|.tmp)$')
+
+    log INFO "Added $num_doc_files other documents to each of $*"
+
+    # Add atom feed for HTML
+    for format in "$@"; do
+        generate::convert_gmi_atom_to_html_atom "$format"
+    done
+
+    # Remove obsolete files from ./html/
+    for format in "$@"; do
+        find "$CONTENT_DIR/$format" -type f | while read -r src; do
+            generate::fromgmi_cleanup "$src" "$format"
+        done
+    done
+
+    for format in "$@"; do
+        log INFO "$format can be found in $CONTENT_DIR/$format now"
+    done
+}
diff --git a/packages/html.source.sh b/packages/html.source.sh
new file mode 100644
index 00000000..d8d2fc66
--- /dev/null
+++ b/packages/html.source.sh
@@ -0,0 +1,145 @@
+html::special () {
+    $SED '
+        s|\&|\&amp;|g;
+        s|<|\&lt;|g;
+        s|>|\&gt;|g;
+    ' <<< "$@"
+}
+
+html::make_paragraph () {
+    local -r text="$1"; shift
+    test -n "$text" && echo "<p>$(html::special "$text")</p>"
+}
+
+html::make_heading () {
+    local -r text=$($SED -E 's/^#+ //' <<< "$1"); shift
+    local -r level="$1"; shift
+    echo "<h$level>$(html::special "$text")</h$level>"
+}
+
+html::make_quote () {
+    local -r quote="${1/> }"
+    echo "<blockquote>$(html::special "$quote")</blockquote>"
+}
+
+html::make_img () {
+    local link="$1"; shift
+    local descr="$1"; shift
+
+    if [ -z "$descr" ]; then
+        echo -n "<a href=\"$link\"><img src=\"$link\" /></a>"
+    else
+        echo -n "<a href=\"$link\">$descr:<br />"
+        echo -n "<img alt=\"$descr\" title=\"$descr\" src=\"$link\" /></a>"
+    fi
+
+    echo "<br />"
+}
+
+html::make_link () {
+    local link="$1"; shift
+    local descr="$1"; shift
+
+    grep -F -q '://' <<< "$link" || link=${link/.gmi/.html}
+    test -z "$descr" && descr="$link"
+    echo "<a href=\"$link\">$descr</a><br />"
+}
+
+html::fromgmi () {
+    local is_list=no
+    local is_plain=no
+
+    while IFS='' read -r line; do
+        if [[ "$is_list" == yes ]]; then
+            if [[ "$line" == '* '* ]]; then
+                echo "  <li>$(html::special "${line/\* /}")</li>"
+            else
+                is_list=no
+                echo "</ul>"
+            fi
+            continue
+
+        elif [[ "$is_plain" == yes ]]; then
+            if [[ "$line" == '```'* ]]; then
+                echo "</pre>"
+                is_plain=no
+            else
+                html::special "$line"
+            fi
+            continue
+        fi
+
+        case "$line" in
+            '* '*)
+                is_list=yes
+                echo "<ul>"
+                echo "  <li>${line/\* /}</li>"
+                ;;
+            '```'*)
+                is_plain=yes
+                echo "<pre>"
      +                ;;
      +            '# '*)
      +                html::make_heading "$line" 1
      +                ;;
      +            '## '*)
      +                html::make_heading "$line" 2
      +                ;;
      +            '### '*)
      +                html::make_heading "$line" 3
      +                ;;
      +            '> '*)
      +                html::make_quote "$line"
      +                ;;
      +            '=> '*)
      +                generate::make_link html "$line"
      +                ;;
      +            *)
      +                html::make_paragraph "$line"
      +                ;;
      +        esac
      +    done
      +}
      +
      +html::test () {
      +    local line='Hello world! This is a paragraph.'
+    assert::equals "$(html::make_paragraph "$line")" '<p>Hello world! This is a paragraph.</p>'
+
+    line=''
+    assert::equals "$(html::make_paragraph "$line")" ''
+
+    line='Foo &<>& Bar!'
+    assert::equals "$(html::make_paragraph "$line")" '<p>Foo &amp;&lt;&gt;&amp; Bar!</p>'
+
+    line='# Header 1'
+    assert::equals "$(html::make_heading "$line" 1)" '<h1>Header 1</h1>'
+
+    line='## Header 2'
+    assert::equals "$(html::make_heading "$line" 2)" '<h2>Header 2</h2>'
+
+    line='### Header 3'
+    assert::equals "$(html::make_heading "$line" 3)" '<h3>Header 3</h3>'
+
+    line='> This is a quote'
+    assert::equals "$(html::make_quote "$line")" '<blockquote>This is a quote</blockquote>'
+
+    line='=> https://example.org'
+    assert::equals "$(generate::make_link html "$line")" \
+        '<a href="https://example.org">https://example.org</a><br />'
+
+    line='=> index.gmi'
+    assert::equals "$(generate::make_link html "$line")" \
+        '<a href="index.html">index.html</a><br />'
+
+    line='=> http://example.org Description of the link'
+    assert::equals "$(generate::make_link html "$line")" \
+        '<a href="http://example.org">Description of the link</a><br />'
+
+    line='=> http://example.org/image.png'
+    assert::equals "$(generate::make_link html "$line")" \
+        '<a href="http://example.org/image.png"><img src="http://example.org/image.png" /></a><br />'
+
+    line='=> http://example.org/image.png Image description'
+    assert::equals "$(generate::make_link html "$line")" \
+        '<a href="http://example.org/image.png">Image description:<br /><img alt="Image description" title="Image description" src="http://example.org/image.png" /></a><br />'
+}
diff --git a/packages/log.source.sh b/packages/log.source.sh
new file mode 100644
index 00000000..55d693ec
--- /dev/null
+++ b/packages/log.source.sh
@@ -0,0 +1,26 @@
+log () {
+    local -r level="$1"; shift
+
+    for message in "$@"; do
+        echo "$message"
+    done | log::_pipe "$level"
+}
+
+log::pipe () {
+    log::_pipe "$1"
+}
+
+log::_pipe () {
+    local -r level="$1"; shift
+
+    if [[ "$level" == VERBOSE && -z "$LOG_VERBOSE" ]]; then
+        return
+    fi
+
+    local -r callee=${FUNCNAME[2]}
+    local -r stamp=$($DATE +%Y%m%d-%H%M%S)
+
+    while read -r line; do
+        echo "$level|$stamp|$callee|$line" >&2
+    done
+}
diff --git a/packages/md.source.sh b/packages/md.source.sh
new file mode 100644
index 00000000..197bdcf7
--- /dev/null
+++ b/packages/md.source.sh
@@ -0,0 +1,55 @@
+md::make_img () {
+    local link="$1"; shift
+    local descr="$1"; shift
+
+    if [ -z "$descr" ]; then
+        echo "[![$link]($link)]($link) "
+    else
+        echo "[![$descr]($link \"$descr\")]($link) "
+    fi
+}
+
+md::make_link () {
+    local link="$1"; shift
+    local descr="$1"; shift
+
+    grep -F -q '://' <<< "$link" || link=${link/.gmi/.md}
+    test -z "$descr" && descr="$link"
+
+    echo "[$descr]($link) "
+}
+
+md::test () {
+    local line='=> https://example.org'
+    assert::equals "$(generate::make_link md "$line")" \
+        '[https://example.org](https://example.org) '
+
+    line='=> index.md'
+    assert::equals "$(generate::make_link md "$line")" \
+        '[index.md](index.md) '
+
+    line='=> http://example.org Description of the link'
+    assert::equals "$(generate::make_link md "$line")" \
+        '[Description of the link](http://example.org) '
+
+    line='=> http://example.org/image.png'
+    assert::equals "$(generate::make_link md "$line")" \
+        '[![http://example.org/image.png](http://example.org/image.png)](http://example.org/image.png) '
+
+    line='=> http://example.org/image.png Image description'
+    assert::equals "$(generate::make_link md "$line")" \
+        '[![Image description](http://example.org/image.png "Image description")](http://example.org/image.png) '
+}
+
+md::fromgmi () {
+    while IFS='' read -r line; do
+        case "$line" in
+            '=> '*)
+                generate::make_link md "$line"
+                ;;
+            *)
+                echo "$line"
+                ;;
+        esac
+    done
+}
-- 
cgit v1.2.3