HOLISTIC Research: Deductive and Inductive logical thinking supported by Lateral/Creative incursions

Conscious thinking is largely a sequential process.

This simple chain of reasoning forms the crux of what is known as DEDUCTIVE LOGIC.

The skies were crimson red. The birds were retiring to their nests. The shadows grew longer as weary watchmakers found the final session the hardest. The morning had begun with a slice of creative insight and enthusiasm, but sustaining this brilliance throughout the work-space-time was an extremely difficult task. Tired legs and weary minds of the watchmakers forced an inevitable temporary diurnal retirement....

As the great Aristotle once said:

You are what you repeatedly do. Excellence is not an act but a habit. 

Note that the word "WATCHMAKER" used here is CRYPTIC. While the deduction of the context in which this paragraph was written is obvious, the interpretation of some key PSEUDONYMS such as "WATCHMAKER" and "work-space-time" requires imaginative thinking (as the injections are almost poetic).

DEDUCTION of context: This simple paragraph begins with the description of a typical seasonal EVENING. Taking the word "WATCHMAKER" in the literal sense does not alter the broader context significantly. The two statements, ".....shadows grew longer....weary watchmakers found the final session the hardest..." and "The morning had begun....sustaining this brilliance... difficult task.", in a way describe a typical work-force efficiency curve widely applicable to all walks of life: students, teachers, industrialists etc. All of us begin the day with a BANG, overflowing with creativity and passion, and finish the day with a quiet reflective cup of tea. The CYCLES of CREATION and REFLECTION alternate not just on a regular DIURNAL SCALE, but also on smaller scales. The work-space-efficiency curve is not just a simple staggered exponential: when one zooms into a segment of this curve, one may observe an almost fractal-like behaviour. The mid-way singularity comes from the injection of the so-called LUNCH TIME. This extension and expansion comes via simple deduction, based on the premise that the attention span of any individual on a specific sub-topic does not last more than a few minutes.

Interpretation of the term WATCHMAKER: The term watchmaker here refers to a fundamental thinker (to a large extent a sequential thinker who finds a way to build simple premises for arguments). Deductive logic is not very difficult, but the construction of premises demands a different form of thinking, which we can term LATERAL THINKING. Premises are essentially levers which permit the thinker to elevate one's thought process and, most importantly, allow one to JUMP the GUN... The construction of the fractal extension would not have been possible without the deviant premise pertaining to the attention span of an individual.

In a broader context, creative thinking leads to what are known as expansions. Seeds sown in the morning thanks to an enthusiastic and passionate mind will eventually sprout over the day. The stitching of ideas is done by the rational mind in two different ways:

1) Inductive reasoning: Throwing open a conjecture or a rule and validating it at different scales and over multiple sets of observations. Finally, when the rule is shown to be valid for a fairly large set of observations, an attempt is made to generalize it. In case the rule breaks down for some observations, logical adaptations are allowed, provided these do not significantly compromise the overall generality and compactness of the rule. The conjecture is the idea which is built on certain premises and invariably stems from creative and holistic thinking. The flow diagram here is:

Idea (Conjecture or Hypothesis or Rule) ---> Observation induction ---> Rule adaptation if necessary ---> Generalization

2)  Deductive reasoning: In some cases the ideas are not exactly conjectures but interesting observation points. A certain pattern not seen earlier is witnessed which somehow uniquely describes a certain phenomenon. This phenomenon is sampled through experimental observations and then a fundamental rule is deduced which captures this behaviour holistically. The flow diagram here is:

Experiments ---> Measurements/Observations ---> Interesting pattern identification ---> Rule deduction ---> Rule verification and Justification through physical linkages with the actual phenomenon ---> Rule testing on fresh data ---> Rule adaptation if necessary (preserving compactness) ---> Final testing

The creative venture here is in spotting a certain interesting pattern in the observation set and then associating this with the actual phenomenon being witnessed, i.e. answering the question WHY this ANOMALOUS behaviour?

Inductive reasoning is the harder of the two, because one has to begin with an excellent HYPOTHESIS or RULE; it is a practice usually followed by purists and theoreticians.

Features of inductive reasoning:

a) Arriving at a rule in the first place which encompasses the overall systemic behaviour or describes the phenomenon.

b) Adaptation phase, in case some of the observations do not fit the rule. Which ones and why? Is there a common thread between them? Can the rule be adapted to accommodate these outliers, without violating the compactness of the rule itself? This slowly turns into an iterative tuning and optimization problem.

c) Generalization with rigorous testing.
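The inductive cycle in (a)-(c) can be sketched as a small validation loop. This is a hypothetical illustration only; the toy rule ("all observed values lie below a threshold") and the adaptation strategy are my own examples, not drawn from the text:

```python
def inductive_cycle(rule, observations, adapt, max_rounds=10):
    """Validate a conjectured rule against observations; adapt it when
    outliers appear, and stop once it generalizes to the whole set."""
    for _ in range(max_rounds):
        misfits = [x for x in observations if not rule(x)]
        if not misfits:
            return rule              # generalization: rule covers all observations
        rule = adapt(rule, misfits)  # adaptation phase: absorb the outliers
    return None                      # conjecture abandoned (compactness lost)

# Toy conjecture: "all observed values lie below a threshold".
obs = [2, 5, 7, 11]
make_rule = lambda t: (lambda x: x < t)
adapt = lambda rule, misfits: make_rule(max(misfits) + 1)
final = inductive_cycle(make_rule(6), obs, adapt)   # adapted to threshold 12
```

The point of the sketch is structural: the conjecture comes first, and the observations only confirm or reshape it.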

Features of deductive reasoning:

a) Setting up experiments.

b) Understanding the vagaries of the experimental setup and the variances involved in various measurements.

c) Spotting the anomalies pertaining to a specific phenomenon.

d) Formulating a rule which captures this anomaly and then providing a physical explanation for the ANOMALY.

e) Generalization with rigorous testing.
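The deductive pipeline in (a)-(e) runs the other way: measurements first, then a compact rule is deduced from the spotted pattern and tested on fresh data. A minimal sketch, where the linear rule and the least-squares fit are illustrative assumptions of mine:

```python
import statistics

def deduce_rule(measurements):
    """From raw (x, y) measurements, deduce a compact linear rule
    y ~ a*x + b by ordinary least squares (the 'rule deduction' step)."""
    xs, ys = zip(*measurements)
    mx, my = statistics.mean(xs), statistics.mean(ys)
    a = sum((x - mx) * (y - my) for x, y in measurements) \
        / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return lambda x: a * x + b

def verify(rule, fresh, tol=1e-6):
    """Rule testing on fresh data: the rule must predict unseen points."""
    return all(abs(rule(x) - y) <= tol for x, y in fresh)

# Experiment: measurements that secretly follow y = 3x + 1.
train = [(0, 1), (1, 4), (2, 7), (3, 10)]
fresh = [(5, 16), (10, 31)]
rule = deduce_rule(train)
ok = verify(rule, fresh)   # the deduced rule generalizes to fresh data
```

Here the rule is extracted from the data rather than conjectured in advance, which is the essential contrast with the previous sketch.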

Non-exclusivity: The main distinguishing factor between the two LOGICAL approaches is that an INITIAL HOLISTIC UNDERSTANDING of either the phenomenon or some systemic behaviour (i.e. prior knowledge) is absolutely necessary for an INDUCTIVE approach, while this understanding comes later, or may sometimes even be accidental, for DEDUCTIVE approaches. The interesting factor that needs to be understood is: WHAT triggers one of these two MODES (or directions of thought)? Are these two BINS/MODES exhaustive? The answer to the latter is obviously NO, as no research thesis can typically be type-cast into one of these two BINS. Sometimes a research thesis may have MULTIPLE PROBLEMS/SUB-PROBLEMS, each of which is of a different type (INDUCTIVE or DEDUCTIVE), and often it is a combination that gives a partial solution to the overall problem.

A viable example of this form of twisted complexity was my own thesis:

Kannan Karthik, "Methodologies for Access Control and Fingerprinting of Multimedia", DOCTORAL THESIS, University of Toronto, 2006.

The thesis had three main fronts or problems:

1) The Joint Fingerprinting and Decryption formulation inspired by the CHAMELEON cipher of Ross Anderson, which attempted to combine seemingly orthogonal processes of WATERMARKING/FINGERPRINTING and DECRYPTION in the COMPRESSED DOMAIN.

The work towards this front/MODE was more INDUCTIVE and less DEDUCTIVE and began with a series of formulations. The INDUCTIVE segment attempted to MAP this JOINT PHENOMENON as a FINGERPRINT KEY CARRIER PROBLEM. It was hypothesized initially that a contortion between the encryption and decryption keys produced this interesting fingerprint at the receiver. [Published both at the SPIE conference (2004) and IEEE Proceedings (2004)]. Construction in coefficient shuffling spaces and leaving sign signatures intact was one of the first few example constructions DEDUCED in this framework and published in IEEE Proceedings (2004).

The INDUCTIVE segment also predicted the integration of CODES such as FRAMEPROOF codes in the design of the keys (which by now were treated as WATERMARK CARRIERS). This opened up two issues: Collusions in the KEY DOMAIN itself and Collusions in the SIGNAL DOMAIN (compressed or uncompressed domain of the host video or image), the former of course being much more destructive than the latter.

Following this, it was later DEDUCED that by first splitting the KEYS into seamlessly mixed SYMMETRIC and ASYMMETRIC segments and incorporating ANTI-COLLUSION CODING METHODOLOGIES in the ASYMMETRIC segment, it was possible to embed FINGERPRINTS at the time of decryption. The strength of the fingerprint was controlled by regulating the correlation between the encryption and decryption keys - a judicious trade-off between DECRYPTION KEY SIMILARITY and FINGERPRINT STRENGTH/ROBUSTNESS.

The analytical model proposed in this thesis, which used simple SHANNON measures for quantifying INFORMATION, predicted that if the decryption keys were made completely independent (which means the carriers were made completely different), it would be both theoretically and practically impossible to implement a strong encryption algorithm. The uncertainty in the cipher space would become comparable to the uncertainty in the watermark/fingerprint space, and subsequently the encryption operation would become an identity function. Details of this amazing ANALYTICAL FRAME (albeit simplistic) were published in my book with Hatzinakos [Multimedia Encoding for Access Control with Traitor Tracing, VDM Verlag 2008/2012]. On the other hand, if the decryption keys were made similar to each other, the fingerprint space would shrink, but the encryption algorithm could be made much stronger. In the extreme case when all decryption keys become one, the encryption algorithm can be any arbitrary strong encryption algorithm which exhibits sufficient strength in confusion and diffusion.

It is the line between the two extremes that opens up this JFD framework to a gamut of soft-encryption algorithms.
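The tradeoff between decryption-key similarity and fingerprint capacity can be caricatured with a toy bit-budget model. This is entirely my own simplification for illustration, not the Shannon-measure model of the thesis:

```python
def jfd_tradeoff(total_bits, fingerprint_bits):
    """Toy bit-budget view of the JFD tradeoff: key bits spent on making
    decryption keys differ (fingerprint capacity) are unavailable as shared
    secret entropy (encryption strength), and vice versa."""
    assert 0 <= fingerprint_bits <= total_bits
    cipher_entropy = total_bits - fingerprint_bits   # shared secret part
    distinct_receivers = 2 ** fingerprint_bits       # distinguishable keys
    return cipher_entropy, distinct_receivers

# One extreme: all decryption keys identical -> full strength, no tracing.
print(jfd_tradeoff(128, 0))      # (128, 1)
# Other extreme: keys fully independent -> 2**128 receivers, zero bits of
# cipher entropy left, so no secrecy remains.
print(jfd_tradeoff(128, 128))
```

Everything between the two extremes is the "line" that the soft-encryption algorithms of the JFD framework occupy.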

2) Design of an Anti-collusion coding methodology to combat linear collusions in the signal domain. This was an interesting sub-problem which was siphoned off the "main course", with a view to overcome the problem associated with the AND-ACC proposed by Wade Trappe et al. (2002). 

The solution came by first observing that SIGN MODULATED fingerprints (which are anti-podal), when averaged, translate to a MAJORITY BIT VOTE of FINGERPRINT CODEWORDS. Subsequently a methodology was developed to generate a compact fingerprint code based on what can be called SURVIVAL PATTERNS of SEMI-FRAGILE fingerprint MARKS. This was the first time sequential and dynamic fingerprint mark insertion was introduced. Each MARK, which was a semi-fragile watermark, was shared across a set of users in such a way that if a majority of these users averaged their copies, the MARK would survive. Another ORTHOGONAL MARK is then introduced for a different subset of users. By optimizing the set differences between user-sets over successive MARK insertions, a compact code can be generated. This idea was discussed in brief in the thesis and then later published in detail in DHIMS Manchester (2007).
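The opening observation, that averaging sign-modulated (anti-podal) fingerprints reduces to a majority bit vote, can be checked numerically. A minimal sketch, not the thesis's embedding scheme:

```python
def collude_by_averaging(codewords):
    """Average several anti-podal (+1/-1) fingerprint codewords and take the
    sign per position: the surviving mark is the majority vote of the bits."""
    n = len(codewords)
    avg = [sum(col) / n for col in zip(*codewords)]
    # sign of the average (0 marks a tie: the bit is erased at that position)
    return [1 if a > 0 else -1 if a < 0 else 0 for a in avg]

# Three colluders; in each position the majority bit survives the average.
u1 = [+1, +1, -1, +1]
u2 = [+1, -1, -1, -1]
u3 = [+1, +1, +1, -1]
print(collude_by_averaging([u1, u2, u3]))   # [1, 1, -1, -1]
```

The surviving pattern is exactly the position-wise majority of the three codewords, which is what makes "survival patterns" a usable coding primitive.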

3) MIX-SPLIT: The last problem was the icing on the cake and became one of the most beautiful formulations. It opened up a Pandora's box of papers, ranging from non-perfect secret sharing schemes for fine-grained access control to applications in intelligent key dissemination in wireless re-configurable networks. The problem began with some simple questions:

a) Is it possible to combine access control with traitor tracing?

b) Perfect access control implies generation of WHITE shares (or shares which behave as white noise when individually examined). Only when legitimate access sets are examined do the combinations reveal something completely deterministic (called a SECRET or sometimes multiple SECRETS). On the other hand traitor tracing entails creation of associations between data sets. WHITENESS on one hand for perfect secrecy and CLOSED associations on the other for traitor tracing. What is this interesting MIDDLE GROUND?

c) A simple non-perfect secret construction based on a MAJORITY and MINORITY vote called MIX-SPLIT was developed as proof of concept and to illustrate the tradeoffs involved in this non-perfect multi-secret sharing scenario. Several potential applications were conjectured.
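The "WHITE shares" end of the spectrum in (b) is the classical perfect-secrecy baseline, which the textbook two-party XOR construction illustrates. This is not MIX-SPLIT itself (which deliberately departs from perfect whiteness to admit tracing); the sketch assumes only Python's standard `secrets` module:

```python
import secrets

def split_secret(secret: bytes):
    """Two-party perfect splitting: each share alone is white noise;
    XOR-ing the two shares together reconstructs the secret exactly."""
    share1 = secrets.token_bytes(len(secret))              # uniform random pad
    share2 = bytes(a ^ b for a, b in zip(secret, share1))  # secret XOR pad
    return share1, share2

def combine(share1: bytes, share2: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share1, share2))

s1, s2 = split_secret(b"access-key")
assert combine(s1, s2) == b"access-key"   # legitimate pair reveals the secret
```

Each share individually is uniformly distributed, so it reveals nothing; it is precisely this WHITENESS that destroys the associations needed for traitor tracing, which is the tension MIX-SPLIT explores.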

The final problem related to this MIX-SPLIT is a typical example of an ITERATIVE INDUCTIVE conjecture followed by a DEDUCTIVE simplification or abstraction.

Problem-1: Largely INDUCTIVE: Formulation first and then the mathematical model followed by abstractions and examples.

Problem-2: DEDUCTIVE: Observation first and then the mathematical frame followed by experimentation.

Problem-3: Iterative INDUCTIVELY DEDUCTIVE: Formulation and mathematical model inseparable each one nourishing the other.

The importance of Lateral Incursions in a CLASSICAL thesis

While the RATIONAL MIND forms the arms and feet of a thriving human body, the CREATIVE MIND is in fact its SOUL and TRUE EYE.

While DEDUCTIVE GROWTH is a sequential process, inductive expansion is largely done through what is known as PERFORATION. There comes a point, once the problem has been identified, when the thinker is expected to meditate on various facets related to this problem. A larger and more complex problem demands a multi-dimensional thought process. Views and counter-views, propositions and counter-propositions permeate the initial formulation. It is by virtue of this initial internal friction, created by the thinker himself, that this PERFORATION is effected. One must always remember that the counter-proposition is usually a lateral thought, not emerging in the same plane in which the proposition was constructed. Reactive propositions are not considered counter-propositions and do not lead to any form of PERFORATION. The process of INDUCTING LATERAL THOUGHTS to build a ladder of ideas at the very outset of the thesis is an excellent and welcome process. The creative and rational mind must be allowed to dance freely and touch various aspects of this problem. The greater the perforation, the larger will be the COCOON which encompasses this beautiful problem. It is then left to DEDUCTIVE LOGIC, supported by ADAPTIVE INDUCTIVE reasoning, to coalesce each of these growing and thriving lateral strands.

The notion of seeing the WHOLE from fragments, of seeing form and shape in DOTS, and finally of seeing the complete figure in an unfinished sculpture, represents what can be called HOLISTIC thinking. The human mind has an uncanny ability to connect the micro-dots, yet withdraw and witness the position of this scribble in the grand scheme of things. The floating small and the SWARM whole all have a role to play in the final thesis.

 

Ref: http://www.shrink4men.com/wp-content/uploads/2011/12/surreal-man-with-hands-covering-face.jpg

Rational thinking is a constrained HORSE-RIDE. The road has limited distractions, but the amount of pleasure that one derives from this ride is limited by the depth of insight. But real and quick progress is facilitated not by HORSE-RIDES but through what are known as lateral JUMPS. This process has been illustrated by this simple optical illusion shown above.

Any "well grounded" CLASSICAL thesis has the following stages, although it may be hard to separate the stages easily. Very often the process, which begins with the search for a multi-dimensional problem and ends with the final general frame which provides an umbrella for the idea, is a CONTINUUM. Compartmentalizing the process into rational extensions and CREATIVE and improvisational spurts is ill-advised, for the walk is SINGLE and belongs to the INDIVIDUAL. The dance is internal and the boundaries are internal. Even the blocks created in the process of inquiry are internal. I am sure the great Swami Vivekanand would agree when we say:

I am the world and this world is nothing more than a figment of my own imagination.

Every problem is confined only to my layers of ignorance. Every solution however has its roots in the infinite wisdom of my own SOUL.

You are interestingly everywhere but here. You only have to stop for a while and look within.

Any CLASSICAL CONVENTIONAL thesis will have the following elements:

Stage-1: FRAME search and identification: Entails searching for an UN-RESOLVED FRAME. In this phase it is not the problem that is identified but it is the FRAME itself. The identified frame must be MULTI-DIMENSIONAL and must have sufficient complexity to tickle the mind of the avid thinker. If the frame is new, all sub-problems developed and synthesized within this umbrella will also be distinct and unique.

Stage-2: FRAME PERFORATION and expansion: Once a NEW FRAME is detected, the next stage is to identify the BASIC essentials (or the research VOCABULARY) which can be used to COMPOSE the FRAME. This process is called FRAME FILLING and is extremely important, as it decides the APPROXIMATE GEOMETRY and the DIMENSIONALITY of the PROBLEM SPACE. It is at the end of this process that the FRAME assumes a VAGUE form, but one teeming with LIFE inside it. Every seed inside is a potential sub-problem, a miniature series of micro-dots that connects with the WHOLE at some point of time, when the understanding has reached a certain level of maturity.

Stage-3: Mathematical BOUNDS and LIMITS of the FRAME: The "form" mentioned in the previous stage becomes a lucid shape, only when the boundaries are known. What are the LIMITS of this frame? This is precisely the question that is answered in the MATHEMATICAL FORMULATION. 

Stage-4: Abstractions within the defined FRAME: Once the limits of the FRAME are defined, a seed or element within this GEOMETRIC FRAME is picked and expanded. The so called ORGANS within this FRAME are grown and nourished. Each ORGAN is the outcome of the solution to one of the myriad sub-problems encased inside this frame.  This is the MIDDLE GROUND and the main publication phase of a CLASSICAL thesis.

Stage-5: Reflective inquiry, STITCHING and OBJECT ENCASING: Eventually, when the problem has acquired a certain maturity, it is time for the researcher to meditate on the entire work, reflect, quietly analyze and STITCH together the complete treatise. There are several ways in which this stage can be understood: one way to view it is through OBJECT ENCASING. Stage-3 has provided the limits or the boundaries. Stage-4 has provided life within these boundaries. So Stage-5 essentially involves a harmonious union of all interesting ideas within the FRAME. This procedure of forming a UNION is called OBJECT ENCASING: OBJECT because there is a RIGID GLOBAL FORM; ENCASING because it is inscribed in the original FRAME.

Stage-6: Further expansion and SIDE PROBLEMS: Stage-5 is virtually a pseudo-closure. However, any fertile mind will have spotted several lateral extensions to the same problem. Addressing some of these FRINGE problems is also a CRUCIAL part of any thesis. This part need not necessarily be completed within the stipulated research time-frame. But sometimes this little LATERAL EXCURSION sows some of the finest seeds in research.

Stage-7: APPLICATION centric perspectives: The work ends or rather begins with a series of application sketches. These are what can be called SMART TOYS, whose soul comes from the FRAMES developed through stages 1 to 6.

You toss a coin, you become a creative thinker.

You take a long walk, you become a deep thinker.

You juggle with pebbles, you become a rationalist.

Finally, when you smile at everything, you become COMPLETE.

But then, you were complete, you are complete and you will continue to be complete. This is the CORE OF VEDANTA.

-- Any queries please direct it to Kannan Karthik (k.karthik[at]iitg.ernet.in or kannan.karthik[at]gmail.com)