Only after a period of tinkering and trial and error can we become clearer about the boundaries of our abilities and where our needs lie. The goal of this article is to help more people minimize the cost of that tinkering.

Next, I will follow the cycle of “Practice → System → Requirements → Tools → Practice” to introduce the basic way of thinking in each section and show the spectrum of options available, hoping that this discussion will help people understand the logic behind their tools and build a workflow that suits them more effectively.

1. Practice → System: Viewing Workflows with Systems Thinking

If you have ever worked with colleagues who treat their work as isolated tasks, completing only what is assigned to them without considering upstream and downstream collaboration, then you can certainly understand the importance of systems thinking. Everything is interconnected. Often, by elevating your perspective and looking at a problem holistically from a higher level, you can find a better solution.

1.1. Tools should be part of the system

When it comes to the use of tools, we also need to cultivate a sense of system: a tool should not be seen as existing in isolation, but as part of a system. Donald A. Norman writes in The Design of Everyday Things:

The only way to address the complexity of services is to treat them as systems and design the total experience as a whole. If each part is designed in isolation, the end result is that the separate parts don’t work well together.

Tools need to be approached with this systems awareness because real usage scenarios are flexible and fluid. For example:

When gathering information, it is often necessary to record sudden flashes of insight.

In the process of reviewing, it is often necessary to fine-tune some of the fragments and to retrieve and collect them further.

Writing itself is not a linear process and usually involves constant switching between tasks: one may need to quickly record sentences that form in one’s mind while reading; or notice a footnote and go retrieve related material; or search for paraphrases and converse with an LLM to gain a deeper understanding of a particular passage. In this process, reading, note-taking, and writing form an integrated whole, not isolated tasks.

Ideally, therefore, these processes should not be isolated from each other but blended together, providing the necessary assistance case by case. This combined use of tools is the higher-level ‘service’, and the output-oriented personal workflow we aspire to build.

1.2 How to develop systemic awareness of tools

So, how should we develop system awareness of tools? My answer is ‘structuring’ and ‘flow’.

Structuring: The so-called system is an organic whole with a certain structure and function composed of interconnected and interacting elements. This will be further explained in the next section “System → Requirements”.

Flow: A complete workflow should ideally include both inputs and outputs, with inputs reinforcing outputs and outputs forcing inputs, the two promoting each other to form a self-reinforcing closed loop. For knowledge management, input + output ≈ reading + writing. A more general thinking framework is the GTD workflow: “Plan – Collect – Process – Execute – Review”. My reading system is built on this framework, though depending on the situation, the actual workflow does not have to follow each step strictly. If a workflow has problems, the cause is often an imbalance between input and output: with too little input, the output is hard to sustain; with too much input and too little output, motivation and the ability to take in more are eroded in turn. In a way, this can also be understood as a balance between thinking and acting. In addition, the order of processing is crucial in certain situations.
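As a rough illustration of this closed loop, here is a minimal Python sketch of a “Collect – Process – Execute – Review” cycle. The data structures and stage functions are my own hypothetical simplification, not any real GTD tool:

```python
# A minimal, hypothetical sketch of the "Collect - Process - Execute - Review"
# loop described above. The structures are illustrative only.

inbox = []        # Collect: raw, unsorted input
tasks = []        # Process: input turned into actionable items
done = []         # Execute: completed items
review_log = []   # Review: observations feeding back into the next plan

def collect(item):
    """Capture anything, without judging it yet."""
    inbox.append(item)

def process():
    """Turn raw input into actionable tasks (or discard it)."""
    while inbox:
        item = inbox.pop(0)
        if item.get("actionable"):
            tasks.append(item["text"])

def execute():
    """Work through the task list."""
    while tasks:
        done.append(tasks.pop(0))

def review():
    """Close the loop: output informs the next round of input."""
    review_log.append(f"{len(done)} tasks completed this cycle")

collect({"text": "draft outline", "actionable": True})
collect({"text": "random shower thought", "actionable": False})
process()
execute()
review()
```

The point is not the code itself but the shape: each stage consumes the previous stage’s output, and the review stage is what makes the loop self-reinforcing rather than a one-way pipeline.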

Workflows built through structuring and flow can later be standardized and automated, and iterated incrementally to make them more attuned to individual habits.

1.3. Before designing a new system…
Trying out new tools and designing new workflows can be pleasurable in itself, especially in contrast to the actual work those tools must support, which is often far more painful. However, our ultimate goal is always output, so don’t tool for the sake of tooling; sometimes it is necessary to restrain the urge to tinker and optimize only what is necessary.

I recommend following Occam’s razor: ‘do not multiply entities beyond necessity’. Specifically, there are two questions to consider:

Could someone else do it? When I used to visit museums and wanted to photograph the exhibits, I spent a lot of time trying to get clear, reflection-free photos. Then I read the introduction to the exhibition “The Great Collection of Paintings Throughout the Ages”, in which the photographs of the collection were taken by a professional team, and I realized that professional work should be left to professionals. Although the shutter itself takes only a thousandth of a second, these teams spend an average of 2 to 5 hours preparing for each shot, not to mention the dozens of steps of color correction and other adjustments. Instead of spending that time taking photos, wouldn’t it be better to simply record the exhibit’s name, author, and other metadata for future retrieval and reuse of professional resources? That, too, is the practice of “don’t reinvent the wheel”.

If you do need to do it yourself, can you build on an existing system instead of designing a whole new workflow from scratch? For example, a task management system is something many people want to build alongside a knowledge management system. For most people, however, the scale of their to-do list does not justify a dedicated task management system; a simple record of the day’s core tasks is sufficient. Even when I, as a project leader, had to communicate and collaborate with more than ten colleagues across product, R&D, teaching and research, design, and other roles, I did not adopt a dedicated task management tool, choosing instead to iterate on my existing note management system. The difficulty of task management lies in thinking: decomposing goals, filtering and pruning them, and reshaping them over and over to advance our goals one step at a time. Most dedicated task management tools are overly structured. Note-taking tools, by contrast, can not only record loose raw information but also effectively support structuring that information, which can then be transformed into executable tasks. This makes note-taking tools ideal for the thinking process. Using a plugin like org-jira, I can synchronize structured thinking from my note-taking system to Jira, a professional project management tool, to collaborate with team members. This way I naturally keep my personal thinking paths, along with the project’s concrete actions and results, in my notes library, so that when I review the project I can quickly pinpoint what I was doing and thinking, and better build on that experience.

2. System → Requirements: There are different types of requirements in a workflow

Next, we need to discover the endogenous requirements in our own real-world usage scenarios.

Just as projects need to be broken down into tasks, workflows cannot be built overnight. So how should a workflow be broken down? Since a workflow is a system of elements with a certain structure, we can think of it as a building assembled from Lego bricks. We can first split along the less tightly coupled seams, disassembling the workflow into relatively independent modules, and then assemble tools to suit our individual needs.

In this way, we naturally construct a workflow with high cohesion and low coupling: the functions within each module are closely related, just as each brick has a specific function and shape (high cohesion); and each module can be adjusted and improved independently of the other parts, just as bricks can be swapped for one another thanks to their common connection points (low coupling). A workflow with high cohesion and low coupling is exactly what we want: one that can adapt to changes in individual needs and keep improving. Compared with “being ready to migrate to a new tool as soon as a need arises that the current tool cannot meet”, this approach lets us find the right interface in the existing workflow and add a new brick, or locate the module that needs upgrading and replace it with a new one. This is undoubtedly more helpful for personal accumulation, and it avoids the dilemma of hopping between tools while gaining little in the end.
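
To make “high cohesion, low coupling” concrete, here is a small Python sketch. The names (`NoteStore`, `PlainTextStore`, `TaggedStore`) are hypothetical: each module hides its internals behind the same small interface, so one block can be swapped for another without touching the rest of the workflow.

```python
from typing import Protocol

class NoteStore(Protocol):
    """The 'connection point' shared by interchangeable blocks."""
    def save(self, text: str) -> None: ...
    def search(self, keyword: str) -> list[str]: ...

class PlainTextStore:
    """One block: cohesive internals behind a simple exterior."""
    def __init__(self) -> None:
        self._notes: list[str] = []
    def save(self, text: str) -> None:
        self._notes.append(text)
    def search(self, keyword: str) -> list[str]:
        return [n for n in self._notes if keyword in n]

class TaggedStore:
    """A replacement block with a different internal structure."""
    def __init__(self) -> None:
        self._notes: dict[int, str] = {}
    def save(self, text: str) -> None:
        self._notes[len(self._notes)] = text
    def search(self, keyword: str) -> list[str]:
        return [n for n in self._notes.values() if keyword in n]

def capture_and_find(store: NoteStore) -> list[str]:
    """The rest of the workflow depends only on the interface."""
    store.save("emacs keybindings cheat sheet")
    store.save("grocery list")
    return store.search("emacs")
```

Because `capture_and_find` depends only on the interface, either store can be plugged in unchanged; that is exactly the brick-swapping property described above.
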

After extracting the different needs from the workflow, we can clearly see that the tools serving different needs differ both in software logic and in selection criteria. And just as pace layering theory says that each layer of a building, and each layer of society, has its own frequency of change, different types of tools have different cycles of use. For the core, cornerstone tools, we should choose carefully beforehand, make the best use of them afterwards, and not change them lightly, so that they provide stability for the whole workflow. Other services can be adopted without excessive comparison: as long as they meet the needs of the moment, they are good value, and if they later prove unsuitable, they can quickly be replaced or upgraded and optimized on the existing basis.

Based on the above goals, I use the classic dichotomy from computing: requirements about “data” and requirements about “programs”. More generally, these can also be called requirements about “documents” and requirements about “processes”.

2.1 Data/document needs

Program-type needs are perhaps relatively niche, but any digital user certainly has data-type needs. I group all three of the following under data-type needs:

Management: providing the proper view to navigate a list of files, and supporting the organization and retrieval of files.

Browsing: the browsing of a single file.

Editing: creation or modification of a single file.

Setting browsing and editing aside, let’s look at the core: management. File management is so basic that every operating system ships a file manager, and users’ file management needs are so varied that many of us have looked for something more powerful than the system’s own. Personally, even years after switching to macOS, I still haven’t found a file manager that fully meets my needs; as a result, I still run Parallels Desktop and Total Commander to manage my macOS files.

File management tools have iterated alongside operating systems, but no amount of optimization can remove a structural limitation: a file can live in only one directory, whereas real-world categorization often requires multiple dimensions. In addition, managing files of different media types has its own peculiarities, which a general-purpose file manager struggles to satisfy. These limitations gave rise to library management tools, which add metadata management on top of traditional file management, making the organization and retrieval of documents far more flexible and efficient. The next article will discuss the different management styles further: beyond traditional file management and library management there is a spectrum of choices between the two, and the two are not completely opposed; a number of solutions accommodate both styles. This will not be discussed here.

2.2 Program/Process Requirements

Program/process requirements usually come from advanced users. Every program requirement is a combination of a “what” (what functions to implement) and a “how” (how to invoke those functions). Some tools focus only on the what: script editors, for example, provide the ability to create and edit scripts, but invoking those scripts must be configured elsewhere, such as in system settings. Some tools focus only on the how: the crontab command, for example, focuses on task scheduling and execution time. More tools combine the two into an integrated solution, such as Tasker, n8n, Keyboard Maestro, and other general automation tools, as well as configuration tools specific to a particular module (tools for setting mouse and trackpad behavior on macOS, various launchers on Android, etc.).

IFTTT’s name, If This Then That, is a great paradigm for understanding program-type needs:
The “what” corresponds to the That part, which defines the functionality to be achieved.
The “how” corresponds to the This part, which specifies the conditions under which the function is triggered.

The difference between different programmatic tools lies in the scope of the “what” and “how” and the way they are configured. For tools that are specifically designed for a certain module, the how is usually built into the logic of the tool and does not need to be configured by the user, who only needs to set the what. For example, tools that set mouse behavior are triggered by default when there is a mouse operation, and various launchers on Android are triggered by default when they return to the home screen. Generic automation tools, on the other hand, require the user to configure what and how.

IFTTT is arguably the simplest general automation tool, where you manually select how and what from the range of services supported by the tool.
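The “If This Then That” split can be sketched in a few lines of Python. Everything here is hypothetical (the event shape, the rule names): the actions are the what, the trigger table is the how, and an automation tool is essentially the glue between them.

```python
# "What": functions that implement some capability (the That).
def resize_screenshot(event):
    return f"resized {event['file']}"

def file_receipt(event):
    return f"filed {event['file']} under receipts"

# "How": a table mapping trigger conditions (the This) to actions (the That).
rules = [
    (lambda e: e["file"].endswith(".png"), resize_screenshot),
    (lambda e: "receipt" in e["file"], file_receipt),
]

def dispatch(event):
    """Run every action whose trigger condition matches the event."""
    return [action(event) for condition, action in rules if condition(event)]
```

A module-specific tool hard-codes the `rules` table and lets you edit only the actions; a general automation tool exposes both columns of the table for you to configure.
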

2.3 Categorization is meant to aid thinking, not to set boundaries

In reality, the relationship between program and data is ambiguous. On the one hand, Gödel’s encoding of number-theoretic propositions as numbers in his proof of the incompleteness theorems echoes the idea of ‘program as data’; on the other hand, understanding mathematics as computation (for example, distinguishing the process of computing π from the result of that computation) helped make computation central to science, one of the biggest intellectual turns of the twentieth century.

In tools, we can also see that the above categorization is not clear-cut:

File management tools mostly integrate file browsing, providing a preview of a single file while you browse the list. Imagine how difficult an image management tool would be to use without image previews, or how convenient it is to preview a file without opening it while browsing a list of files.

Some browsing tools also include simple editing features, such as rotating and cropping in the image viewer and editing the table of contents in the PDF viewer.

Some editing operations are highly structured and need not be performed by opening files one at a time for individual treatment. This kind of batch editing is better suited to integration into management tools: for example, using calibre to batch-convert book files into formats supported by your reader.
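As a hedged sketch of that kind of batch edit: calibre ships a command-line converter, `ebook-convert`, invoked as `ebook-convert input output`. The loop below only builds the commands (the file names are made up) so they can be reviewed before anything is actually executed, e.g. via `subprocess.run`.

```python
from pathlib import Path

def plan_conversions(files, target_ext=".mobi"):
    """Build one `ebook-convert` command per source file.

    This only constructs the command lines; it does not run calibre.
    """
    commands = []
    for f in files:
        src = Path(f)
        dst = src.with_suffix(target_ext)  # same name, new extension
        commands.append(["ebook-convert", str(src), str(dst)])
    return commands
```

Once reviewed, each command list can be passed to `subprocess.run` to perform the actual conversion.
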

Some powerful software even dissolves the distinction between data and programs. The previous section demonstrated how flexibly Emacs handles programmatic needs, which matters to me because I also use Emacs to manage my notes, code projects, and so on. Many file management tools support plug-in extensions that allow a high degree of customization of the tool’s own behavior, adapting it to users’ diverse and individual needs.

This happens precisely because tools are designed to deal with the problem that “as the number of tools increases, so does the friction of using them”. Which features are integrated into a tool (addition) and which are left beyond its boundaries (subtraction) depends on the product designer’s restraint. A professional product (a point also discussed in the previous article, “How to judge the future development trend of a software”) satisfies both the tool’s internal functionality and its external compatibility, finding the optimal balance, rather than only doing addition to meet user demands or only subtraction to satisfy personal aesthetics.

Thus, we distinguish program from data not because there really is such a clear boundary, but because the categorization can serve as a thinking tool for understanding an application’s logic. Knowing which category a tool belongs to allows us to better:

Understand the tools. Different types of tools have different intrinsic paradigms, and choosing the right frame helps us grasp a tool’s position faster and get through the novice break-in period. Discussions that compare tools across types are often unproductive, yet they happen all the time. For example, “calibre is a poor reading experience, so I went back to iBooks” is like saying “Zotero is a poor reading experience, so I went with PDF Expert”: both compare management tools (calibre and Zotero) with browsing tools (iBooks and PDF Expert). The core function of a management tool is to organize and retrieve a large number of documents; browsing is just an auxiliary function to facilitate management, and it certainly cannot match the experience of a dedicated browsing tool. If you only need to browse individual documents, you should choose the right browsing tool; choosing a management tool instead introduces needless complexity.

Estimate the time worth investing. As mentioned earlier, different types of tools have different life cycles, so the time worth investing in a tool, whether for up-front research or subsequent learning, differs too. The longer a tool’s life cycle and the more pivotal its place in the workflow, the more time and effort it deserves. Changing course on a core tool will most likely involve migrating core data, breaking the workflow in the short term. In my view, just two categories of pivotal tools deserve the most attention: management among data tools, and general automation among program tools. Other tools for editing, browsing, or configuring individual modules need relatively little investment: meeting current needs is enough, and if one later proves unsuitable, replacing it has little impact on the overall workflow. The core applications here are really the management tools: book management and note management. Besides a tool’s life cycle, its frequency of use is also an important factor in assessing learning costs.
If a tool will be used over a long period and used frequently, it is definitely worth the time to learn; note management is the typical example. Most other management tools are merely entry points, with the real browsing and editing happening in the tools they jump to, whereas note management tools tend to integrate management, browsing, and editing, so many note-takers keep them running for several hours a day. Moreover, since we take notes not only for the present but also for the future, we expect an ever-growing body of notes to generate compounding benefits (sparking inspiration, accelerating output, supporting decisions, and so on), so note-taking tools have a very long life cycle.

Design how the tools in a workflow collaborate with each other. Later we will describe how the tool chain should be split and combined to build a sustainable workflow, with the type of each tool serving as an important reference point.

Once you understand the tools, you can more accurately determine whether they are suitable for you by combining them with your own needs and capabilities.

For example, if we need an automation tool, we can quickly understand its core by applying the “If This Then That” framework: which “what” (functions) and “how” (invocation methods) does it support? Since general-purpose automation is a pivotal tool, we should expect to spend some time learning and configuring it. Automation tools are configured in three forms: manual selection, manual assembly, and writing code. Depending on your needs and abilities, choose the most appropriate form for the moment; if it fails to meet future needs, the direction for upgrading and optimizing will also be clear.

Published by Tony Shepherd & last updated on April 13, 2024 8:23 am