29 March 2014

BST By The Hour

The BST clock change every few months is quite deceptive. It is a good thing that so many systems run on epoch time, avoiding the constant adjustment of clocks, which could otherwise become a nightmare, especially in a cloud environment where data may be distributed across networks situated in different time zones. Adjusting the clock is not a good thing even if it buys an extra hour of daylight or an extra hour of nighttime. We really should keep the changes in perspective, given why we receive more daylight or darkness in the first place. Changing the clocks is artificial, even though the notion of time is real. One needs to adjust life to nature, not adjust nature to one's way of life.
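
As a small illustration of why epoch time sidesteps the clock change, the sketch below uses the new Java 8 time API to render one fixed instant on the epoch timeline as London wall-clock time. The instant itself never moves; only its zoned presentation picks up the GMT/BST rules.

    import java.time.Instant;
    import java.time.ZoneId;
    import java.time.ZonedDateTime;

    public class EpochDemo {
        public static void main(String[] args) {
            // One instant on the epoch timeline, fixed and zone-free:
            // 1396141200 epoch seconds is 30 March 2014 01:00 UTC,
            // the moment the clocks go forward this year
            Instant instant = Instant.ofEpochSecond(1396141200L);

            // The same instant rendered as wall-clock time; the zone
            // rules (GMT vs BST) are applied only at display time
            ZonedDateTime wallClock = instant.atZone(ZoneId.of("Europe/London"));
            System.out.println(instant);   // 2014-03-30T01:00:00Z
            System.out.println(wallClock); // 2014-03-30T02:00+01:00[Europe/London]
        }
    }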

History of British Time

Alternatives To Hadoop

Hadoop has become the industry standard for batch processing at scale, especially for big data applications that require a hefty amount of machine learning over a large collection. MapReduce sits at the center of many such processing requirements, while the sub-modules of the platform facilitate various specialized tasks within a pipeline or workflow. The core aspects of Hadoop are the storage of files and the processing of those files. In the process, each stage is loosely abstracted out within the sub-projects to resolve various needs. There are also various open source alternatives available instead of Hadoop, or even in conjunction with it, especially for real-time needs where Hadoop often does not fit well, or where one's needs go well beyond what HDFS is able to meet. There might also be situations where one's needs are simply not demanding enough to justify using Hadoop, which would make it overkill in complexity.
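
As a point of reference for the MapReduce style of processing, here is a minimal sketch of the canonical word-count mapper against the Hadoop mapreduce API; the matching reducer would simply sum the emitted counts.

    import java.io.IOException;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Each input line is split into tokens and emitted as (word, 1)
    // pairs; the framework shuffles them to reducers for summing
    public class WordCountMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);
                }
            }
        }
    }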

Cloud Automation

Managing infrastructure for development teams can be a real chore, especially when there are thousands upon thousands of instances to manage and a multitude of applications to serve. Automation is often the answer, a near miracle in saving countless hours of manual configuration and deployment work. With the rise of the cloud, many software automation tools have come about to help developers understand their operations environment better and to make the approach more accessible as well. SaltStack, Fabric, Puppet, Ansible, Cloudify, Capistrano, and Chef are a few options. Perhaps the most interesting option here is SaltStack, which provides orchestration, remote execution, configuration management, and a whole host of other features. It embodies an answer to cloud automation in countless ways while trying to keep things as simple as possible.

27 March 2014

Quora

Quora is a site where questions and answers are collaboratively discussed and modified by a community of users - essentially a social process. However, this social process is flawed: opinions are at times taken personally by users, and the site lacks the integrity to respect everyone's opinions rather than those of a specific few. In the process, it discriminates between opinions offered as answers to questions, which in a lot of ways diminishes the whole collaborative and social aspect of the process. The model is flawed, and the answers will often not be objective. It also means people get into heated discussions when they take things too personally, which can descend into anti-social behavior in the form of personal attacks on individuals. The moderation process is also very discriminating, and perhaps can even be viewed as bigoted, as the evaluations too often come down to discriminating against users. To all intents and purposes, if one cannot handle a variety of opinions then perhaps it is not the best place to be, as one will often end up taking things too personally. There is a rather big social issue on the internet arising from people taking comments so seriously that they lose the perspective and context of reality. Holding an opinion, agreeable or not, is everyone's essential right. What turns into anti-social behavior is when personal attacks are made, which is not conducive to the openness of the social web on which the internet has been so successful. Perhaps social media and networks need to be approached with a more open view by users, knowing that there will often be views they do not agree with, but having the sense of respect to understand that others have a right to hold their own opinions, right or wrong. This is one failing of the social web: people blur the lines between reality and the internet and often end up taking things too personally. It is also why there needs to be a certain awareness of such issues, along with the understanding that a global space like the internet will necessarily have a variety of content not agreeable to all. Such sites also display a very significant level of bigotry in the way they define their terms of service and in the way content and users are moderated. It often leaves one with a bad taste when such sites blur perspectives and lose the whole defining objective of an open social web. Maybe one way to avoid such sites is to not support their use until they are able to provide an equal and open approach to the way they handle users and content. In general, collaborative approaches are not effective at achieving objectivity. Nor are they efficient at establishing constructive and accurate paths towards intelligent solutions. For open question answering, as well as recommendations, the approaches need to utilize semi-automated techniques to provide better alternatives. They could even use a purely automated approach, but that avoids social collaboration entirely. In a manner of speaking, social collaboration is only as good as the value it adds to an algorithm as a form of human assistance in guiding an intelligent system towards a logical conclusion. Quora as a social platform does not work, nor does it provide accurate answers.
But what it does do is provide a variety of stagnating inputs in the form of collaborative, insightful answers which can be classed as opinions with very little added intelligence - not a very smart solution to an uncomplicated problem. Alternatively, they could provide an open question/answer search function alongside the community of answers, which would add richer contextual value towards building more natural semantics as well as discoverable analytics. An ontology could help in harnessing more semantics as well. The natural step, in the process, would be to spontaneously link the social web of discussions via linked data, through which an evolving graph emerges.

26 March 2014

Java Mac Setup

A useful blog on the basics of setting up multiple versions of Java on OS X. Even jenv is pretty good.

Semantic Annotations

Semantic annotation is a broad and complex area, often requiring a mixture of natural language processing as well as knowledge representation. One of the major inherent requirements in an application is to provide for word sense disambiguation. There are also more lightweight approaches that generalize on the semantics alone in the form of ontologies, especially for maintaining publications and cataloging. Such semantics can cater for text as well as multimedia. What this enables is that semantic labels can be constructed in context and provided for findability, better visualization, reasoning over a set of web resources, and the conversion from syntactic structures to knowledge structures. One can approach this manually or in an automated fashion. The manual step typically transforms syntactic resources into interlinked knowledge without taking much account of the multiple perspectives of data sources, and is applied using third-party tools. There is also the approach of semi-automated annotation, though it too requires human intervention at various phases of the process. GATE is one such semi-automated tool for extracting entity sets. Automated approaches usually require tuning and re-tuning after training. They can get their knowledge from the web and apply it to content in a context-driven manner for automatic extraction and annotation. Wrappers are created that can identify and recognize patterns in text for annotation, at times human assisted, and they may use various classifiers as a supervised way of learning patterns. For the annotation of multimedia, this often takes the approach of rich metadata; alternatively, it could lean more on content semantics, or even go granular within the multimedia itself. Annotations can be global, collaborative, and even local. One could extend and provide rich annotations using custom metadata, variously defined through controlled vocabularies, taxonomies, ontologies, topic maps, and thesauri for different contexts. There is even a W3C effort for open annotation, as well as the LRMI effort based on schema.org as a learning resources initiative. One could also build a pipeline through the various workflow stages of a content filtering process using UIMA, or even take a CMS approach similar to Apache Stanbol. Standard tools like Tika, Solr, OpenNLP, and Kea can also be useful. Languages like Java, Groovy, Python, XML, RDF, and OWL are often used for implementations and rich textual semantics. However, tools are increasingly emerging on Scala as well.
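
As a rough sketch of the automated end of the spectrum, the following combines Tika for text extraction with an OpenNLP name finder to emit candidate entity annotations; the input file and the pre-trained en-ner-person.bin model are assumptions here, not fixed names.

    import java.io.File;
    import java.io.FileInputStream;
    import org.apache.tika.Tika;
    import opennlp.tools.namefind.NameFinderME;
    import opennlp.tools.namefind.TokenNameFinderModel;
    import opennlp.tools.tokenize.SimpleTokenizer;
    import opennlp.tools.util.Span;

    public class AnnotationSketch {
        public static void main(String[] args) throws Exception {
            // Extract plain text from any supported document format
            Tika tika = new Tika();
            String text = tika.parseToString(new File("input.pdf"));

            // Tokenize and run a pre-trained person-name finder
            String[] tokens = SimpleTokenizer.INSTANCE.tokenize(text);
            TokenNameFinderModel model = new TokenNameFinderModel(
                    new FileInputStream("en-ner-person.bin"));
            NameFinderME finder = new NameFinderME(model);

            // Each span marks a candidate entity that could be emitted
            // as a semantic annotation over the source document
            for (Span span : finder.find(tokens)) {
                StringBuilder entity = new StringBuilder();
                for (int i = span.getStart(); i < span.getEnd(); i++) {
                    entity.append(tokens[i]).append(' ');
                }
                System.out.println(span.getType() + ": " + entity.toString().trim());
            }
        }
    }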

Smart Fraud Management

Fraud management is a real issue on the web - and not just on the web, but quite pervasively in our society, where it takes on many forms. However, many systems these days are not smart enough to detect fraud. They are also limited in their capacity to discriminate. Unfortunately, people often take the misguided approach of trying to detect fraud in all the wrong ways, especially when they attempt to discriminate on the basis of racial profiling. This often adds to overfitting or underfitting of the data. Sanctions databases are also centrally managed, which at times means they are not kept sufficiently up-to-date in real-time, nor can they be fully trusted for accuracy. Even credit reference agencies are not always up-to-date with their records in real-time, and at times make mistakes that can affect individuals for years. How do you effectively protect people who make transactions online and take the necessary steps to protect their identities? Banks are very incompetent in protecting customers from identity fraud. Intelligent methods these days need to do more to tackle such attempts. If someone loses their life savings through fraud, it should be realistically possible to get the money back, all of it, and the systems corrected as a future preventative measure. To a certain degree this is possible today in certain regions. Although this is a security issue in many respects, it is also a compliance issue. Fraud really needs to be viewed objectively, without tainting the data with subjective human stereotypes. That is the only way fraud can be more effectively detected and prevented. Also, detecting fraud on the basis of a boolean yes or no is insufficient. Even the idea of presumptuous scoring is insufficient, as it is all very subjective. Intelligent methods against fraud need to take a holistic approach. Such systems need to be embedded with what ethics really means. They need to be embedded with a notion of regulation and compliance based on real-time updates. The systems could monitor transaction flows in a fully encrypted fashion. The decentralization of fraud management is good, because even such services can come under disrepute. Often, providers of fraud management are required to maintain audits and compliance in order to provide high levels of trust. But how often do we find carelessness in data management? While customers are victimized online, fraudsters continue to find better evasion methods, and at times are coy enough to continue the same practices undetected. This indicates that fraud management in many services, especially on the web, is not working and has not reached a significant crossroads of improvement. It does seem valid that machine learning is the answer to most such fraud management approaches, and detecting patterns and learning from them certainly helps in targeting and retargeting. However, a richer sense of semantics is necessary as to the meaning of fraud as well as the meaning of trust. It is also necessary to provide a more secure form of distributed identity management, where people could check their rolling identities and be more mobile with them, trusting in the knowledge that they are intact from preying eyes. Unfortunately, even the notion of securing systems these days is rather a shady area, as there are always backdoors deemed necessary to provide checks and balances, especially for legal and regulatory purposes, which can inevitably provide for leaks.
Intelligent systems with the semantic web are the answer to most things of today. Using both statistical techniques and artificial intelligence, framed in a rich knowledge representation, is a workable approach. Once the knowledge representation is there, a knowledge discovery graph becomes much more likely. Most such solutions can be tracked using a graph database to understand familial links and detect richer patterns, even from assisted metadata. Approaches applying deep learning by way of neural networks can be a positive step as well.
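
As a toy example of the statistical side, the sketch below flags a transaction whose amount sits more than three standard deviations away from an account's history; a real system would combine many such weak signals rather than rely on any single yes or no test, which is exactly the criticism above.

    import java.util.List;

    // A toy statistical check, not a production fraud rule: flag
    // amounts that deviate strongly from the account's own history
    public class AmountOutlierCheck {
        public static boolean isSuspicious(List<Double> history, double amount) {
            double mean = history.stream()
                    .mapToDouble(Double::doubleValue).average().orElse(0);
            double variance = history.stream()
                    .mapToDouble(a -> (a - mean) * (a - mean)).average().orElse(0);
            double stdDev = Math.sqrt(variance);
            // With no variation in history, any different amount stands out
            if (stdDev == 0) return amount != mean;
            return Math.abs(amount - mean) / stdDev > 3.0;
        }
    }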

25 March 2014

Smart Web Security

The internet is such an open community with a global user base. Unfortunately, trust is a major issue. Web security has, over decades, become a separate mainstream business of its own. However, keeping pace with updates and new malware is a constant concern, not only for users but also for software providers. Inevitably, hackers are able to pry open backdoors into all sorts of software to do harm. And there just is not enough being done to make the internet a secure and safe interconnected network of places for users. In the end, the integrity of the internet suffers, as do service providers, who can at times suffer downtime as a result. There are also privacy issues around the way data is handled. Perhaps all this boils down to what counts as sufficient security and whether it is robust enough. There are also separate issues here regarding preventative steps versus actionable software to fix issues once havoc has been wreaked through a system. There are a multitude of new alternatives to the standard approaches most security software is built on. For one, they need to harness better encryption of files. There also needs to be a more distributed security model that bolsters resistance to penetration in an evolvable way. In this respect, natural computation algorithms that provide for global optimization are one aspect of the answer; another is to utilize multiagents to deductively respond to attacks on a system, making the security stronger by adapting and learning with each iteration. Perhaps aspects of reinforcement learning would be another way. There is a lot that can be done with machine learning algorithms. In this view, the security system would first have to detect a penetration, then respond to it by evolving the system, and then learn based on the patterns of attacks. It would make it extremely difficult for a hacker to predict an evolving and random change in patterns, especially when the system is always a step ahead. Learning based on security patterns is a relatively good way to bolster a stronger security model, and then to erratically mutate and crossover to add even more variety. Optimizing the solution would involve aspects of evolutionary computation, adding calculated randomness to the patterns based on some fitness function. Even aspects of swarm intelligence could work: identifying an attack by clustering on it as if it were a food source, and then taking the appropriate steps to resolve it, like a swarming army of bots. For most artificial intelligence problems, and for data mining in general, there is usually a set of inputs and a set of desired outputs which can feed into each other in a pipeline. Thus, most applications of artificial intelligence involve a black box where most of the processing is carried out against a set of inputs and outputs. In a similar manner, for security, the inputs can be the penetration attacks on a system; the process can adapt and adjust to provide better output instructions as to what counter steps to take, based on a feedback loop - in the process making the system evolve, in a form of game play. For web security, adaptability is probably the best approach, as it can also be unique in its patterns to each system. Not every system requires the same level of security measures and audits.
Web browsers also need to be pushed towards more intelligent means of security as pluggable options, which essentially act as clients in a client-server model. Unfortunately, there will always be people looking for backdoors to target ethical users on the internet, who may be all too innocuous. Linked data and the semantic web could also be part of a solution, especially for added knowledge representation. Standard systems of today really need to start taking inspiration from artificial intelligence in order to become smarter, future proof, and more pervasive, making our lives easier as well.
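
To make the evolutionary computation angle concrete, here is a minimal, self-contained genetic algorithm sketch; the fitness function is a stand-in toy (counting set bits) for whatever scoring a real security model would evolve its configurations against.

    import java.util.Arrays;
    import java.util.Random;

    // Mutation and crossover over bit-string "configurations", with a
    // toy bit-count fitness standing in for a real security score
    public class EvolveConfig {
        static final Random RAND = new Random();
        static final int LENGTH = 32, POPULATION = 20, GENERATIONS = 100;

        static int fitness(boolean[] genes) {
            int score = 0;
            for (boolean g : genes) if (g) score++;
            return score;
        }

        public static void main(String[] args) {
            boolean[][] pop = new boolean[POPULATION][LENGTH];
            for (boolean[] ind : pop)
                for (int i = 0; i < LENGTH; i++) ind[i] = RAND.nextBoolean();

            for (int gen = 0; gen < GENERATIONS; gen++) {
                // Sort fittest first, then breed over the weaker half
                Arrays.sort(pop, (a, b) -> fitness(b) - fitness(a));
                for (int i = POPULATION / 2; i < POPULATION; i++) {
                    boolean[] parentA = pop[RAND.nextInt(POPULATION / 2)];
                    boolean[] parentB = pop[RAND.nextInt(POPULATION / 2)];
                    int cut = RAND.nextInt(LENGTH); // single-point crossover
                    for (int j = 0; j < LENGTH; j++)
                        pop[i][j] = j < cut ? parentA[j] : parentB[j];
                    pop[i][RAND.nextInt(LENGTH)] ^= true; // point mutation
                }
            }
            Arrays.sort(pop, (a, b) -> fitness(b) - fitness(a));
            System.out.println("best fitness: " + fitness(pop[0]));
        }
    }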

24 March 2014

Heroku

Heroku is another useful platform as a service offering, built on AWS. It uses a lot of Ruby, which is essentially what the platform started with, but it has steadily grown to support more add-ons and programming languages. It also has a very quick build and deployment approach compared to Google App Engine, as well as tight integration with Git. Setting things up on the platform is very easy. And the free tier is unlimited in domains, though it has usage limits. The main drawback to the platform is that it can get very expensive very quickly once the data storage limits start being reached with the add-ons, and once additional capacity is attached for scalability. Perhaps one has to weigh up whether to use Heroku, AWS, or both.

Google App Engine

Google App Engine is a useful platform as a service offering. However, it lacks many of the accessible options for the standard developer. The SDK services require a bit of fiddling around to make integrations work, although this has improved over time. This is in comparison to the seamless offerings of AWS and Heroku. Language support is also lacking: one can only develop in Java, Python, or Go. Even the standard relational option has only recently been introduced. Custom domains require a Google Apps account, which is also limiting as a service compared to the flexibility offered on other platform services. As yet, many developers are looking at Google App Engine as a point of experimentation for building prototypical applications. Another obstacle with the platform is that it is not portable. A project has to be custom built to work on the platform and cannot later be transferred to another PaaS provider without much re-engineering, although third-party solutions are slowly emerging to tackle this misconception. Google App Engine also works out more expensive over time than AWS, which has become pretty much the standard choice for cloud based development in the large - although, to be fair, AWS is more of an infrastructure as a service provider. In order to compete with other PaaS providers, Google would have to make their cloud services more approachable for developers, and provide options for the use of custom domains outside of premium pricing. It is not that custom domains cannot be used without premium access, but it still requires much tweaking to get such things to work. With AWS one at least gets a free tier for a year to plan for a full launch, as well as a lot of flexibility in the cloud services. And even Heroku provides a useful free option. It may also be more realistic to compare it against Heroku, as they are both essentially PaaS providers. Google App Engine is, however, useful for Python and Java developers. But for Java it also has quite a few limitations and restrictions as to which libraries can be used and accessed. A lot of core services on the platform work quite well if one is building a Python based application, as updates and new changes are available first for that language. There is still much for Google App Engine to do in evolving towards industry accessibility and growing out its cloud services model. But one thing is for sure: if one does deploy onto the platform, one gets good service uptime for handling high data loads. Nevertheless, it is still a pretty solid cloud platform for building scalable applications. There is an obvious and conscious choice one has to make when choosing the right cloud provider, whether for prototype projects or production use, depending a lot on the evolving application needs.
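
For a sense of how little platform-specific code a basic deployment involves, the Java runtime on App Engine builds on the standard servlet model, so a hello-world handler is just a plain servlet; the class name and path here are purely illustrative.

    import java.io.IOException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // A minimal App Engine handler: a standard servlet, mapped to a
    // URL in web.xml and deployed with the SDK tooling
    public class HelloServlet extends HttpServlet {
        @Override
        public void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws IOException {
            resp.setContentType("text/plain");
            resp.getWriter().println("Hello from App Engine");
        }
    }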

Smart Utility Services

Utility services are getting more and more expensive for every household. Energy in all its forms as a consumable keeps rising in price. Unfortunately, as technology driven products and services become all the more a necessity, utilities are becoming an ever more important dependency for all. Winters are growing colder while summers are getting hotter. It seems only natural that something must be done about the way utility services are consumed and conserved. Building more intelligent meters would be useful. Also, the ability to conserve around the house when utilities are not being used would be an adequate functionality to have on electricals; it exists on most standard PCs nowadays but not on most electrical items. Robotics could also come in handy around the house for utility consumption, as well as in creating options for better fuel and energy sources that can be renewed, in turn saving suppliers huge costs. Often the expense passed to the consumer comes down to the supplier being greedy and stockpiling in order to inflate prices, which has an effect on the way they are traded on the financial markets. There have to be better alternatives that can leverage cost savings across the board. These can come from hybrid approaches: semantic models across cities, and artificial intelligence techniques to harness better alternative fuel sources that can drive more efficiency as well as meet practical demands.

23 March 2014

Simply Bootstrap

Bootstrap is a pretty amazing framework, or perhaps a frontend toolkit. It does everything desired of a fast moving agile development team and facilitates an immense number of standard approaches. It also makes web design fun for a developer, as they do not have to spend hours and hours tweaking something while also trying to stay functional in their development work. Bootstrap essentially saves time in developing simple and sleek layouts. It is also fully customizable in its approach to web design using standard HTML, Less, and CSS. Using the standard grid layouts also helps, but is not mandatory. It even adds a sense of consistency and a regular process for updates. One core aspect of the approach is that it is future proof, using HTML5 and CSS3, and very well documented. Third-party assisted customizations are also possible for typography, layouts, and javascript enabled interactions. The framework makes design approachable by anyone and a lot less frustrating for the developer as well. It is also becoming the framework of choice for many in the development of visualizations, portfolios, and dashboards. There are, though, a few drawbacks to keep in mind. It may not provide for best practices in separating content from presentation. At times it may require modifications where it collides with existing setups. It also has quite a large footprint as a result of the heavy functionality baked into the framework. Customization also becomes a major requirement with Bootstrap; otherwise the layout ends up looking similar to everyone else's and leaves little room for uniqueness. Bootstrap is also being used quite heavily for single page websites. Alternatively, one can always go back to hand-coding everything from scratch, if that makes one more productive and efficient. The framework may not be the best choice for a design-focused individual who is driven by standards and requires more flexibility for handcrafting everything. All in all, even if Bootstrap does have certain limitations, it provides a good alternative for developers who are more inclined towards building frontends quickly while spending more time on backend work. Furthermore, it has a nice integration with Bower, which is a package manager for the web, or perhaps more of a web component installer. One could use Yeoman for code generation and Grunt as a build and utility tool, all of which can be installed via the node package manager.

Social Parenting And Linked Data

Social media is a hot spot for all age groups. However, at times parents have no clue what their children are getting up to online and whether it is age appropriate. There are efforts to incorporate more age controls from internet service providers, but this does not restrict the varying terms and rules of service that each web application holds. More also needs to be done to circumvent the peer-pressure driven suicides that result from social media outlets, which in reality could be prevented given the level of insecurity they project onto an individual - something that may further be expounded as a form of social bullying. Naturally, though, the responsibility falls on the parent to be aware of what their children are up to, at least with the sense to protect them within the bounds of privacy. Linked data in some ways also holds a key towards making the social web more accessible for controls on the internet, making it a safer place away from predatory afflictions. It seems that people over decades have become more susceptible to influences and outside forces affecting their personal space in the world and clouding their judgement. In some ways, by allowing more connected data, we may be able to make accessible the controls necessary to elaborate on content restrictions. Although the internet is an open network providing many individual freedoms, there is still an element of netiquette and responsibility to be maintained. Parents often need help from technology providers to assist them in being responsible. But at the same time, what underage children do is really the responsibility of the parent.

Spring for Big Data

Over the last few years, SpringSource has been pushing more and more modular projects in the Big Data and Cloud arena. Projects are being spun up to resolve complex issues and make life easier for developers as a whole, often making most of such development needs freely accessible. Some of the solid new projects that have come along are Spring XD, Spring Data, Spring YARN, Spring Hadoop, and Spring Batch. They all attempt to resolve context specific issues via separation of concerns and to deliver more in the way of developer productivity.

Spring Boot

Spring Boot is a new and evolving platform which has only recently had a milestone release. Perhaps it is an alternative being incorporated in order to improve on or replace the often flawed and frustrating tooling of Spring Roo. The platform takes an opinionated view to ease and reduce the ramp-up time for rapid development of projects. It builds on the existing modules of Spring to provide further flexibility for production-grade development, allowing one to use what is needed based on evolving requirements. One core feature of the approach is the move away from code generation and excessive XML configuration. It even incorporates Groovy scripts at the command line. Perhaps the approach also takes much inspiration from the Play framework, adjusted to the flexible needs of a typical Spring developer. The approach also means more helper features to ease developer productivity, especially alongside cloud deployments and web services.
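
As a rough illustration of the opinionated, configuration-light style, a self-contained Spring Boot web application can be about as small as the following, along the lines of the project's own getting-started samples:

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
    import org.springframework.stereotype.Controller;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.ResponseBody;

    // No XML and no code generation: auto-configuration detects the web
    // dependencies and stands up an embedded server around this class
    @Controller
    @EnableAutoConfiguration
    public class SampleController {

        @RequestMapping("/")
        @ResponseBody
        String home() {
            return "Hello World!";
        }

        public static void main(String[] args) {
            SpringApplication.run(SampleController.class, args);
        }
    }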

Intelligent TV

It would be interesting if the TVs of today were a lot more intelligent in reflecting our habits and tastes as well as catering to our personalization needs. At any given time, they should have the capacity to learn our natural viewing patterns and incorporate that into providing us with a schedule of shows through the day. This is one aspect that would optimize TV viewership, especially as network services work towards such goals in chasing after ratings. Even advertising markets could change dynamically with the habits of viewers. Also, on satellite and cable TV there are just so many channels available; unless one has a double screen TV or a view-in option, it is difficult to watch multiple shows at once. Catering to individual needs is the answer for many. It also means people can watch all their missed shows to match their busy lives. One thing that TV does is show repeats so people can catch up on what they have missed; however, such shows also invariably repeat a little too much, boring the likes of many as well. There is also the need to move beyond the notion of a TV and transform it into more ubiquitous viewing. The utility of a TV needs to harness its surroundings within the everyday gadgets of a home. It needs to be available over the internet. And it needs to allow people to connect socially. The internet would also make quality viewing possible, especially for channels that have border restrictions. Even the approach to 3-D TVs needs to be pushed faster in its advancement, so people do not have to wear glasses to make use of such a facility. The more intelligent our electronics become, the more versatile they are for us to use, and the more we can take advantage of them as part of our everyday lives. TVs could even take inspiration from Minority Report, or even the marvels of S.H.I.E.L.D., and provide us with thinner sheets of glass displays to advance display technology in general.

NLP vs CL

There are so many synonymous terms floating around the technical community that they confuse many in the field as well. Even terms like computational intelligence, soft computing, and artificial intelligence can be grey areas beyond the fields of research. But within applied work in business sectors, no one bothers to wonder one way or the other as long as the method is sound and logically transpires into a workable solution. Natural language processing and computational linguistics are also such grey areas, and in business they are often seen as synonymous in what they actually do. There is hardly much of a difference other than the way they approach certain linguistic problems. Perhaps computational linguistics is more focused towards understanding language through the use of computational tools, maybe even creating such computational tools in the process; it certainly involves building and analyzing textual corpora. Natural language processing, however, takes the linguistic tools a step further by applying them to a practical solution: deriving meaning from a given set of input and output variables where such computational models can be applied in context. But for many, these fields overlap considerably as far as roles are concerned.

Dynamic Programming

Often web scale problems are large, requiring work with big data. Dealing with such cases means breaking problems down into sub-problems. One can often use divide-and-conquer approaches, applied recursively in the context of sorting and searching, such as mergesort, quicksort, and binary search. These algorithms work in a top-down fashion: in divide-and-conquer, the problem is broken down into sub-problems in order to solve the original problem. However, at times it is necessary to work bottom-up, especially in cases like web search where sub-problems recur. In this approach, the smallest sub-problems are solved first, and as a solution to a sub-problem is found it is stored for later lookup, building up towards the larger problem. Dynamic programming is useful in domains such as advertising and complex distributed analytics on big data. In this respect, one is trying to find the optimal solution. The principle of optimality holds when every optimal solution to a problem contains optimal solutions to its sub-problems; the reverse is not necessarily the case. A typical dynamic programming problem is the knapsack problem, which does not work well with a greedy approach. Although dynamic programming is not appropriate for all problems, it offers a very useful optimization strategy for solving complex situations which would ordinarily take exponential time to solve optimally.
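
A minimal bottom-up sketch of the 0/1 knapsack shows the idea: the best value for each capacity is a stored sub-problem solution, reused rather than recomputed, turning an exponential search into a small table fill.

    // Bottom-up 0/1 knapsack: best[c] holds the best value achievable
    // with capacity c using the items considered so far
    public class Knapsack {
        public static int maxValue(int[] weights, int[] values, int capacity) {
            int[] best = new int[capacity + 1];
            for (int i = 0; i < weights.length; i++) {
                // Iterate capacity downwards so each item is used at most once
                for (int c = capacity; c >= weights[i]; c--) {
                    best[c] = Math.max(best[c], best[c - weights[i]] + values[i]);
                }
            }
            return best[capacity];
        }

        public static void main(String[] args) {
            int[] w = {3, 4, 5};
            int[] v = {30, 50, 60};
            // Prints 90: the items weighing 3 and 5 fit the capacity of 8
            System.out.println(maxValue(w, v, 8));
        }
    }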

AMP

Keyphrase Extraction With Maui And Kea

Keyphrase extraction is a method of obtaining, for indexing, the most frequently occurring or important phrases in the context of an application. They can be useful for search engines indexing document collections, for advertising, and for many other domains. Carrot2 is often used as an embedded clustering service within Solr search. From a semantic web point of view, however, it is useful to have vocabularies to work with in order to attain rich extractions in context, based on a defined dictionary of terms. Kea is one useful keyphrase extraction library that utilizes SKOS vocabularies, making rich extraction very useful. It is further extended through Maui as an indexer, and has an integration for machine learning via Weka. It is ideal when one wants to extract on a specific context and then further disambiguate such contexts based on a substantial custom controlled vocabulary of terms over a large set of document sources. This can be extended further with Maui acting as the automatic indexer for topical tagging, keyphrases, keywords, descriptors, and specific terms.

Semantic Jokes

Finding jokes online is like trawling through heaps of irregular websites and sources. At times the jokes are not even all that funny. There is also a whole range of joke topics available, constantly growing as appetites and tastes for humor change over time. Jokes are also an art form, often deeply rooted in unique styles and the personal wit characteristic of the manner in which they are told. However, on the web, finding interesting jokes is still very much steeped in the uncharted waters of unearthing the right joke for the right context - a very problematic exercise. Herein lies knowledge discovery: exploring the deep web as a graph, contextually, for sources of jokes embedded within web documents, and extracting the many facets of their semantics. The semantic web is a fitting thing to apply here, along with a unique ontology to go with it. Moreover, linked data could explore the entire web, connecting the many joke sites and numerous sources for aggregation. People desire humor in their lives, and it is often the cure for many ills. Furthermore, such semantically aggregated exploration could be used not only for building useful search sites, but also to provide smart, natural language aware bots offering a witty, personable experience to the user, both via the desktop browser and on mobile. Once a structure for contextual knowledge discovery is in place, there are innumerable uses that could provide the answer to the web in the form of extensibility and reuse.

GitHub Websites

GitHub is a very useful platform for source control, its distributed model possibly the best there is for flexibility and community driven projects. It also has a very interesting option for hosting free project or portfolio sites using GitHub Pages. Simple and sleek looking websites can be built in a matter of minutes, with the addition of blogging from the same repo base. It only really supports simple static HTML and CSS, which is often enough to get a quick website up and running against a domain. It also uses a very simple model for blog sites with Jekyll; an alternative to that in Python would be Hyde. An alternative to GitHub is Bitbucket, which is also a very interesting platform. It further serves as the basis of the Atlassian Stash platform for the enterprise. For premium services, the Stash platform may work out cheaper for the enterprise compared to GitHub, as may Bitbucket, which on its free option allows unlimited private repositories, whereas GitHub only allows unlimited public repositories on its free plan.

Smart Software Development

In current times, software development still requires the intervention of a developer to provide the input for requirements, design, implementation, testing, build, and deployment - all part of the software development life cycle. With the cloud we have seen much in the way of build and deployment automation; there is, however, much that could be improved in the other phases. Requirements, for one, could utilize statistical natural language processing models. There is also the UML approach, which could be taken further using artificial intelligence to automate the process of design, implementation, and even testing of the code base, and even data modelling. The artificial intelligence would also be a lot faster at learning the syntax and semantics of a language than a developer. In that respect, developing the model would really be the only necessary role of a human developer or engineer. Using a model-driven approach is key towards further automation. These approaches would require the knowledge representation techniques of model checking and automated reasoning, which are used extensively for circuit design. Genetic algorithms provide much in the way of code generation and refactoring through their simple mutation and crossover techniques working towards a global optimum over the search space. It may prove more effective these days to attain a computer science degree than a software engineering degree, as the role of the computer scientist will become key in the evolution of software development in the years and decades to come. Already, artificial intelligence plays a key role in data mining, medical applications, and testing. It is only a matter of time before large scale development work also starts to witness significant automation in abstractions within the key areas of the software development life cycle.

Smart Postal Services

Postal services are becoming quite antiquated in their approach. It is also, for many, a struggling business, not to mention the extra level of care they have to maintain these days for mail security. There have to be better and more effective ways to run postal services. In a manner of speaking, the role of a postal service has to evolve drastically to meet the pressures and demands of a digital age, both for individuals and for businesses. Digital mail is seen as an alternative to standard postal mail; however, how much one can digitize in the form of mail is another story of data protection. Postal work splits into two streams: courier and general postal services. For mail sorting, they could apply more robotics to specific functions and gradually phase them in. For postal deliveries, they can treat route planning as a travelling salesman problem, which goes further than a simple shortest path problem (see the sketch below). They could also enable barcodes and GPS tracking on standard mail for a more effective means of streamlining sorting and delivery; another alternative is RFID tags. Even semantic web and linked data approaches could be applied here, harnessing the web using geonames. There is also a need to reduce delivery times as well as to increase reliability. Postal services need to embrace the option of email for standard letter parcels in digitized form. There also needs to be an automated way for users to scan and register their own postage, along with more flexible mailing services. Amazon has an interesting option of providing in-store delivery boxes for customers, which is quite a good idea; something similar could be applied to general postal services, so that ecommerce companies like Amazon would not need to create their own services. Perhaps this is an indication that postal services are not effective enough for ecommerce. It also seems that drones could be applied to local postal systems based on their specific districts, enabling geotracking of senders and recipients. Even personalized pickup services should be made available. Postal services could also incorporate advertising into their system of delivery, which on its own could add a significant portion of revenue. Most postal service sites also lack responsiveness in their interfaces and provide very limited mobile alternatives. Analytics is another option that could add further optimizations. In general, postal services need to incorporate the digital age into their business objectives and harness the web towards a diversification of services. There is much that can be improved, and not just in delivery but across all their standard mail functions.
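
As a sketch of the route planning idea mentioned above, a simple nearest-neighbour heuristic over a distance matrix gives a workable, though non-optimal, baseline for the travelling salesman flavour of a delivery round.

    import java.util.ArrayList;
    import java.util.List;

    // Nearest-neighbour heuristic: from the depot, always visit the
    // closest unvisited stop next; quick but not guaranteed optimal
    public class RoutePlanner {
        public static List<Integer> plan(double[][] dist) {
            int n = dist.length;
            boolean[] visited = new boolean[n];
            List<Integer> route = new ArrayList<>();
            int current = 0; // index 0 is the depot
            visited[0] = true;
            route.add(0);
            for (int step = 1; step < n; step++) {
                int next = -1;
                for (int j = 0; j < n; j++)
                    if (!visited[j] && (next == -1 || dist[current][j] < dist[current][next]))
                        next = j;
                visited[next] = true;
                route.add(next);
                current = next;
            }
            return route;
        }
    }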

22 March 2014

Security Updates

Desktop and web software often provide automatic updates for new releases and patches. However, at times such updates can cause havoc on one's PC and potential conflicts with other installed applications. Fortunately, for most types of software there is usually a workaround to roll back an update to a previous snapshot; for a lot of web software, this is not always the case. One also needs to be aware of the context of such updates. Windows and Mac have options for automatic updates, for example, and so do web browsers, if they are set to update automatically. However, with all the intrusions and privacy concerns floating around with snooping on internet users, there are no sufficient safeguards in place to combat it. There is speculation that certain companies may be cooperating with such institutions to provide access for snooping to take place. One very obvious way this can be planted is via the standard software updates that happen in the background without the user even being aware of them. An obvious way to avoid such a mishap is to not set updates to happen automatically. Another way is simply to disconnect the PC from the internet altogether when not in use and watch for any background jobs that seem to get stuck in the process. It is also always handy to maintain snapshots or backups of the operating system periodically and to regularly scan the PC for malware. Also, some software requires an open TCP port when running, which is another obvious avenue of entry; having an effective firewall strategy is useful too. This may all be rather basic, but such things often get overlooked by regular users as part of standard practice. Web security is a big concern nowadays, and one very obvious snooping access point is email messages. Businesses often use digital mail encryption during transmission to protect against any manner of snooping. As users face more and more security concerns, there will always be people looking to build preventative software; and, in order to stay in business, there is always a need for people to provide for vulnerabilities. The internet has created not only a lot of benefits but also huge security issues in the process. Unfortunately, as one consents to using the internet, one also has to take responsibility for one's actions, including the protection of one's own privacy. The internet is one place which has no borders, and where anarchy would pretty much reign if it were not for standardization.

21 March 2014

Dating Sites

It is a sad world where people find it so difficult to meet in person that they have to go on dating sites. There are dating sites and matchmakers all over the internet, each providing some form of stickiness. However, they all want the person to subscribe in some way or another. It is even worse when someone has to pay a subscription to meet people online. It is also not the safest option. The internet is one place where free speech and opportunities abound; however, there are also people out to exploit others. There has to be a better approach to connecting people with like minded individuals. Meetup is perhaps an even better approach: at least the process gets people connected at a very contextually specific level and transforms it into a personable group activity. But it is still, in essence, a very disconnected social experience. Perhaps a better alternative might be to use the semantic web, with all its metadata, and build a linked data of communities, similar to a linked data of discussions. The social web has endless possibilities. People can even build their connections via mobile phone and connect in real-time using bluetooth while out and about. One will never know another person by just meeting them online; it has to be an in person encounter. Dating sites have never really been all that popular on the web compared to social networking sites like Facebook and Twitter. There is also a real need here for a more semantically connected identity, allowing people to be more mobile but at the same time safe from fraud. Dating sites can also be a very shallow place to meet people, in similar respects to Facebook. Perhaps this can be equated to a major drawback of capitalist society: people tend to become more individual in their mindset, but at the same time many end up living a very separate existence. This is typical of big city living, where people may appear cold towards others and more vigilant of their surroundings. Dating sites also appear to be more popular with people living in rural areas rather than urban ones. This is in many ways an indication of how people living in different parts of the world feel drawn and connected to the opposite lifestyle out of curiosity. People also live busier lives compared to the way people lived and worked decades ago. But there has always been a need in people to share their existence with someone else. Surely dating sites cannot be the answer if bars and clubs are so popular in a city. Perhaps a lack of services, busy family lives, and boredom in local communities draw individuals onto such sites. But building another disconnected site is no longer the answer for a social web.

20 March 2014

Queues

Queues are almost the workhorses of a backend application. They provide a streamlined platform for externalizing problems faced by large production systems in order to coordinate, mediate, and route processing requirements in different formats and for various domain needs. This may also incorporate transformation between formats in one form or another for delivery. They also provide ways of achieving massive scalability and load balancing between producers and consumers. At times, these services are necessary to maintain the low-latency requirements of an application. Latency is the time it takes for a message or a block of data to travel, or perhaps the round trip delay, which has an effect on response time; in principle, high latency is considered bad versus low latency. Over the years, many protocol standards have been established and are still in a continuous process of maturing. A huge variety of open source queue service options are available. Unfortunately, they all have benefits and certain drawbacks.
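
The producer/consumer decoupling at the heart of it can be shown in-process with a bounded Java BlockingQueue; a message broker plays the same mediating role between processes, layering routing, persistence, and load balancing on top.

    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    // A bounded queue decoupling a producer from a consumer: put()
    // blocks when full and take() blocks when empty, smoothing load
    public class QueueDemo {
        public static void main(String[] args) {
            BlockingQueue<String> queue = new ArrayBlockingQueue<>(100);

            Thread producer = new Thread(() -> {
                try {
                    for (int i = 0; i < 5; i++) queue.put("message-" + i);
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            Thread consumer = new Thread(() -> {
                try {
                    for (int i = 0; i < 5; i++) System.out.println(queue.take());
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            producer.start();
            consumer.start();
        }
    }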

The Architecture Of Open Source Applications

18 March 2014

Java 8

Today marks the arrival of the new Java version 8 release. It also marks the start of a new beginning for the programming language platform, further supporting the use of functional programming and lambdas. The benefits of lambdas, in theory, are to use concurrency effectively, as well as the callback style of programming that is the typical makeup of cloud based development. Obviously, as always happens with a new release, there will be some uptake time during which some will quickly look to installing and playing with it, others will stall their upgrades until any issues are flushed out in the community, and others still will languish on version 6 and only just now start to migrate to version 7. There will certainly also be language related libraries and frameworks needing upgrades in order to support the new platform. This means another quarter of version upgrades and rediscovery within the community for the platform. It also means a closer fit with the Scala language over time.
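
A small taste of the new style: a lambda where an anonymous inner class used to be, and a parallel stream as a one-line opt-in to concurrency.

    import java.util.Arrays;
    import java.util.List;

    public class LambdaDemo {
        public static void main(String[] args) {
            List<String> names = Arrays.asList("ada", "grace", "alan");

            // Pre-Java 8 this would be an anonymous inner class
            names.forEach(name -> System.out.println(name.toUpperCase()));

            // Streams make concurrency a library concern, not a loop rewrite
            long count = names.parallelStream()
                              .filter(n -> n.startsWith("a"))
                              .count();
            System.out.println(count); // 2
        }
    }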

5 features java 8 will change

Comic Services

Flip book comics can be a lot of fun. However, smarter comics could also be fun, and making one's own comic strips could be even better. Why not use comics as a way of teaching kids to learn things? Most kids, and even adults, enjoy comics. It is one of those things that one never grows out of. Comics nourish our youth and provide us with a feeling of nostalgia, and yet they are also filled with stories in the form of bubbles. Creating new characters and making them come alive could also be quite cool. Helping kids develop their own flip books could be one way of teaching them to read in the simplest of ways. Also, smart comic services could be introduced to develop better stories with specific traits. Even collaborative comics could be a new form of social networking. Going further, connected comic stories could let people communicate in the form of comic bubbles and forge them as part of their memories. Further still, comic services could provide a means of linked data, where a web of stories could be translated into comic strips. Even coded network security could form comics that are translated into binary at the point of encoding and decoding of messages. Comic social networking could be quite fun, with a way to create your own strips of episodes building within a sub-plot for different characters. One could even perform knowledge discovery and store the results as preserved pieces of comic stories. Comics have been around for generations upon generations, and yet as we grow old they begin to dwindle as part of our everyday lives. They are a treasure chest of memories that could be treated like thinking caps for our everyday lives, perpetuating the very stories that we live every day. Perhaps there really could be the epitome of a hero in all of us.

17 March 2014

Semantic Images

There are quite a lot of image sites on the web. Some are socially driven and others are for stock photos. It would help if such image sites provided more semantics with their images. Images really need to be readable by crawlers. They also need to be easy to interpret as part of content extraction. A semantic ontology for images would help define, at a metadata level, what is contained in them, in order to provide to some degree the subliminal cues for retrofitting connections across rich document content. It would also allow for better linked data searching. With so many dispersed stock image sites, it would be useful if they all worked together in a connected way, sharing their content and making it accessible to all. When one looks at images on the web, one notices so much that is disconnected from knowledge discovery. Text and images provide rich sources of content. There is an awful lot of content on the internet yet to be tapped by crawlers in order to penetrate the deep web and allow for more content analytics as well as ambient intelligence.

Kaggle

One of the coolest ways to build data science experience is to jump right into online competitions and learn the methods through hands-on application. There are even rewards to be had. Plus, being a Kaggle competitor lets one showcase one's work to potential employers. Fasten the seat belt and dive deep into big data analytics. It is a crowdsourcing alternative for finding solutions to big data challenges in predictive modeling and analytics. Kaggle competitions are a serious business. And when one does find a solution, the license rights are handed over to the host - which is a bit of a niggle of sorts.

Vertx

A new breed of event-driven, server-side, asynchronous IO frameworks has emerged. Most are already aware of Nodejs as a specifically Javascript aligned framework. However, Vert.x is emerging as the pinnacle of simplicity and sophistication for a JVM enabled developer. It seems the model is just so flexible and scalable that it merits a clear win over Nodejs. Perhaps the underlying question here is between the Play Framework and Vert.x. Although the platform is still fairly new, it does support polyglot programming. It would certainly help if there were more plugins available on the platform to make life even easier. Re-inventing the wheel may be the way to go with Vert.x; but perhaps the pain of starting from scratch is far outweighed by the sheer simplicity and versatility of the approach. In time we will see how Vert.x transpires against the Nodejs approach across the many application use cases in production. At least for the sake of security, Vert.x is a clear winner, and even for scalability. Vert.x will be much preferred over Nodejs in the financial services community, but also by big data miners who may have a sheer dislike of the Javascript coding style. It will be a day of reckoning as to how such frameworks converge in time to provide some view of their usefulness. It also does seem that the development community is rather unfazed by such frameworks, in a form of reluctance. Maybe it is down to the fact that they are just too new in their approaches to balance developer training against the cost of new development.
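
A minimal verticle, sketched against the current Vert.x 2 Java API, shows the flavour: a non-blocking HTTP server in a handful of lines, with the event loop doing the concurrency work rather than hand-managed threads.

    import org.vertx.java.core.Handler;
    import org.vertx.java.core.http.HttpServerRequest;
    import org.vertx.java.platform.Verticle;

    // Deployed onto the Vert.x platform, which calls start() and runs
    // the handler on an event loop; never block inside the handler
    public class ServerVerticle extends Verticle {
        @Override
        public void start() {
            vertx.createHttpServer()
                 .requestHandler(new Handler<HttpServerRequest>() {
                     public void handle(HttpServerRequest req) {
                         req.response().end("hello from vert.x");
                     }
                 })
                 .listen(8080);
        }
    }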

Freebase vs DBPedia

Freebase and DBPedia are both community supported semantic databases of content. On one hand, Freebase is curated for people, places, and things, while DBPedia takes most of its information from Wikipedia. The two databases are now interlinked as linked data, albeit across a partial set of topics. However, their approaches differ. Freebase utilizes the Metaweb Query Language as a customized option, whereas DBPedia supports SPARQL. DBPedia also has a large academic community with other semantic side projects, whereas Freebase is owned by Google. Most such databases, in order to be classed as open data, need to share a very open license policy on data. The choice of which one to use depends entirely on one's application needs. The curated data may differ across the two databases based on the data sources; in fact, even Freebase uses data extracted from Wikipedia. They both have different goals, schemas, and identifiers. Freebase is perhaps more diverse in its use of data sources. Freebase also makes it freely accessible for users to curate the data; however, in order to get an update into DBpedia, one would first have to update it on Wikipedia. Even the structure of data storage is slightly different: Freebase is based on n-tuples, whereas DBPedia is based on RDF. Freebase is more in tune with the open data community, whereas DBpedia tries to follow the strict approaches of the Semantic Web. Even tool development is mostly third-party for DBPedia, but mostly from Google, as well as the user community, for Freebase. One could utilize both in a linked data setting to build meshable vocabularies, taxonomies, thesauri, or even topic maps.
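
Since DBPedia exposes a public SPARQL endpoint, querying it from Java with Jena is straightforward; this sketch asks for a few resources typed as programming languages, with the query kept deliberately simple.

    import com.hp.hpl.jena.query.Query;
    import com.hp.hpl.jena.query.QueryExecution;
    import com.hp.hpl.jena.query.QueryExecutionFactory;
    import com.hp.hpl.jena.query.QueryFactory;
    import com.hp.hpl.jena.query.ResultSet;

    public class DBPediaQuery {
        public static void main(String[] args) {
            String sparql =
                "SELECT ?lang WHERE { " +
                "  ?lang a <http://dbpedia.org/ontology/ProgrammingLanguage> . " +
                "} LIMIT 10";
            Query query = QueryFactory.create(sparql);
            // Fire the query at the remote endpoint rather than a local model
            QueryExecution exec =
                QueryExecutionFactory.sparqlService("http://dbpedia.org/sparql", query);
            try {
                ResultSet results = exec.execSelect();
                while (results.hasNext()) {
                    System.out.println(results.next().get("lang"));
                }
            } finally {
                exec.close();
            }
        }
    }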

16 March 2014

Beautiful Data

Data can take on many dimensions, shapes, contexts, and infinite forms. It is no wonder that information is the only boundary limiting how much can be elaborated from data. One set of data alone can provide for so many informational uses. Perhaps it is the information that is beautiful, not really the data that is collected. The elegance of the stories that can be told is the information driver of a data source. Visualizations inspire insights that are driven through the lens of information and contextualized from a set of data sources.

beautiful data images
visualizations awards
beautiful data with oreilly

Aurora Borealis

Interesting images on Flickr of the aurora borealis give a new dimension to the natural phenomenon. Simulations at such a level could lend huge meaning to the way we illuminate data as well. Nature is often the miracle cure for many web and artificial intelligence challenges.

aurora borealis flickr

Wolfram Language

Gone are the days when programmers had to define all of their own abstractions. Knowledge-based languages are the new approach to making life easier. Wolfram Language is symbolic, natural, knowledge-driven, and extremely large, yet it can be used in a multitude of specialized domains. Mathematica uses it. Wolfram Alpha uses it. What is so powerful is that the knowledge is pre-built inside the language, making it aware of its domain semantics in a programmable context. That makes the input of data and the translation of output much easier, enabling it to represent arbitrary data with ease. What Wolfram Language attempts to do is make the world more computable rather than merely generate information. General in its approach, it combines a multitude of programming paradigms, from symbolic computation to functional and rule-based styles.

Visualizing Changes In Habits

It would be useful to have an application that can visualize the characteristic habits of people over the web and how those habits change over time. In the process, one could develop a sense of deduction and conduct sophisticated behavior analysis. Understanding people and how social phenomena develop on the web can provide cues for predicting future intents within any given topical constraint of interest. Such an application would not be intrusive but assistive, providing better all-round optimized services on the web. Underlying habits define the core of what develops into the current tastes, trends, and interests of individuals or of particular communities. It would amount to a graphical depiction of the sociology of the web. Visualizations often tell more than words ever could, almost defining and exposing hidden facts. Even the domains can be contextualized to provide further focus points.

Web Search For Missing Persons

Public services really need to open up a lot of their data, especially for emergencies, as communities can really help in the process by identifying patterns and trends in the data. One area that is severely lacking is the assistance available to parents and families who have to struggle through information about missing persons. They often have to rely on public services that are at times incompetent, slow, and restrictive with their investigations. Perhaps there should be a web search available for people to look through missing persons records, track histories and whereabouts, and map associations that link to specific targets, providing a way to locate victims and perpetrators. At the least it would help speed things up and give the victim a chance to be found before they are hurt or lose hope. Many families go through such an ordeal never to find their missing members again. More really needs to be done to make government databases available for searching, with the community then able to filter and gather results so people can search freely in the process. The Semantic Web, geolocation awareness, and machine learning are a few approaches that could help in this domain for analyzing and contextualizing informational knowledge from data.

Google Hummingbird

Hummingbird is a new approach devised by Google to search and sort through the information contained in web pages, together with the context of queries, to produce the best answers. In fact, it operates alongside PageRank, the often-embedded algorithm, rather than replacing it. What the name essentially implies is precise and fast. Search momentum is towards more conversational approaches, leaning towards question answering that provides better results based on intent and context. The enhancement allows a way to connect a user's queries in context for smarter search results. In a funneled view of search, the top of the funnel mirrors browsing for information, the layer down implies one is exploring and shortlisting options, and the stage below that implies the intent to buy. Perhaps this actionable intent provides for smarter content generation by publishers and better contextual marketing, which implies better visibility, more valid answers to questions, and an increase in the contextual value of content. At least theoretically speaking, as the new search algorithm is still relatively new.
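
To make the funnel idea concrete, here is a toy sketch in Java; it is purely an illustrative heuristic, in no way Google's actual algorithm, and the keyword cue lists are invented for the example.

    import java.util.Arrays;
    import java.util.List;

    // A toy intent funnel: queries are bucketed into browsing, exploring,
    // or buying stages by simple keyword cues.
    public class IntentFunnel {
        enum Stage { BROWSING, EXPLORING, BUYING }

        private static final List<String> BUY_CUES =
            Arrays.asList("buy", "price", "deal", "cheap");
        private static final List<String> EXPLORE_CUES =
            Arrays.asList("best", "vs", "compare", "review");

        static Stage classify(String query) {
            String q = query.toLowerCase();
            for (String cue : BUY_CUES) if (q.contains(cue)) return Stage.BUYING;
            for (String cue : EXPLORE_CUES) if (q.contains(cue)) return Stage.EXPLORING;
            return Stage.BROWSING; // default: informational browsing at the top of the funnel
        }

        public static void main(String[] args) {
            System.out.println(classify("what is a hummingbird"));     // BROWSING
            System.out.println(classify("best camera vs smartphone")); // EXPLORING
            System.out.println(classify("buy camera cheap"));          // BUYING
        }
    }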

London Formula One

It would definitely be amazing if such an event could take place in London. With so many historical sections of the city on display, it would mean a massive economic boost, not to mention tourism. However, the traffic and congestion would certainly need to be thought through carefully, as such an event would in practice need to be staged on a weekend, outside of rush hour. Plus, London really does come to a standstill at even the slightest change in weather. Almost every year there is an issue, from minor thunderstorms to a sprinkling of flurries; outside of London it is almost always about flooding. The event would also draw more crowds into an already crowded city. However, there is such a sheer amount of big data that could be cultivated from such an event for huge urban benefit, even to understand market and social trends. It would also force London to push towards a plan for a cleaner city, as the event would bring a huge load of garbage. One could also argue that London's carbon footprint could grow by leaps and bounds from such a fuel-guzzling event. Perhaps it is a waiting game until the dream eventually becomes a reality. It surely would be a dream come true for advertising sponsors and property developers. House prices could even skyrocket, forcing people to move further and further outside of London.

formula one telegraph

WorldCat

An integration of libraries is emerging. WorldCat is one such effort, a synergistic collaboration among multiple libraries spanning the globe. With such a huge effort come quite a few integration obstacles, and it still has a lot of issues to overcome. The Semantic Web should really be the cornerstone of its ideal, with the domain mirroring a huge custom ontology. WorldCat provides a search facility over the indexed holdings of registered libraries, accessible as a way of finding local resources. The service even allows one to connect to a librarian and review the resources, albeit with a membership requirement to view or download.

FAST Linked Data
OCLC Experiment
GIST
OCLC
dataliberate

Smart DNA Analysis For Medical Checkups

It is no longer enough for a patient to trust a doctor. These days there are many malpractice lawsuits where the doctor is shown to have been negligent during the course of treatment, surgery, or a standard examination. As patients, we need to be conscious of our bodies and of the genetic makeup that is unique to our DNA. Our DNA can tell a lot about us as humans and as individuals. We all have different ageing processes, different levels of stress, different anxieties, different reactions to treatments, and even our diagnostic results can differ. Even the environment we live in affects our bodies. Perhaps we really need to take a broader outlook on our medical checkups, not just for the short term but for the long term, to plan for and understand the many natural virtues and remedies that would be unique to our own bodies. Smart DNA analysis via machine learning and neural processes is one way of harnessing better assistive medical checkups to provide a more thorough examination of a patient. It is invariably expected that patients will seek second or even multiple opinions from doctors before they can know conclusively. Trusting a doctor is a massive aspect of medicine, for which doctors are expected to maintain a standard of ethics. But having an automated analysis process for a second opinion does not harm anyone either. Why not use it to provide patients with a full medical history and even identify cures to infections and diseases that have yet to be discovered? It is also a way to avoid animal testing on behalf of humans. DNA is at times the only answer to understanding ourselves, and analysis by way of data mining can help in understanding a patient's potential medical problems, either now or in the future, perhaps even providing an answer to how they live now and what changes they can make in their daily lives as preventative measures. This would also help significantly in reducing the cost of publicly provided medical services. Often patients would like to know more about their health, and providing them with a full report may not be fear-mongering but rather a way of allowing them to see the bigger picture and take control of their own lives. As long as such analyses are not used for serpent-like affairs, they could prove quite useful to the general public and reduce the number of doctor visits required to analyze medical problems. They would in fact even allow a doctor to seek an immediate second opinion if in any doubt of their own medical judgement. There may be requirements for the protection of medical records, but that can usually be handled through patient-doctor confidentiality. This is perhaps one form of robotics that can be useful in medical practice: not to replace doctors, but to assist them. Building case histories could even add to a cumulative feedback loop for treatment, audits, and further assessments.

St. Patrick's Day

It is interesting how the color green suddenly appears everywhere on St Patrick's Day, alongside copious drinking. Green architecture seems to sprawl everywhere and come alive in celebrations, with the symbolic shamrock tinting many cities in shades of richness. Green is symbolic of the richness of leaves; it is the visible light located between yellow and blue and makes up one of the additive primary colors. The associations one can make with green are huge and quite powerful: nature, the environmental movement, Ireland, spring, hope, and even envy. The links below illustrate the social and historical significance of the day as it is celebrated across the time zones.

Change.org

Change.org is an interesting effort to bring together the synergy of the social web and harmonize it towards a means of action for a cause. It is one of the few web sites providing a facility for collecting petitions and driving change, while at the same time measuring the vocal power of the internet and the accessibility it provides for all. In a manner of speaking, where voices can't be heard, the web is making things possible for many. There is immense power in the written voice of the many over the few, which can provide for substantial change, or at least a sense of awareness and the first steps towards a serious effort. There is a poignant notion here that makes everyone feel a sense of agency for what they believe in, gathering hope from the support they receive in mere signatures and having their voices heard. Maybe a few signatures do hold huge power towards making change when so many get together to support a cause, without even needing to rally on the streets. It makes the web feel more like a civilized vehicle as well as a potent weapon for political and social change.

Non-Stop

A pretty dull movie with a lot of excessive facial drama rather than any immersive action. It was not exactly non-stop, more a burst of drama every 20 minutes. It felt like being stuck watching a fear-mongering movie loaded with typical stereotypes. There also seems to be more marketing involved here than quality entertainment; it looks heavily sponsored by a mobile manufacturer, as the scenes depicted an interaction with a mobile phone every few minutes. The movie almost dabbles in Snakes on a Plane territory, yet spirals through one dubious character after another. It does, however, demonstrate that racial profiling is typical and almost always incorrect. Liam Neeson at times shows that he can still take the 'Taken' character and play it out in multiple scenarios with a bit of gusto. Perhaps this is what makes the movie seem at times workable, if not totally pointless.

Smart Property Services

We live in a society where people want more out of data. And yet there is so much of an information explosion that we require more context and smarter services aligned to our personalized needs. Property services are quite a drag, and yet people want to derive more information about their local areas and about how their property prices are fluctuating. Buyers and sellers have different contexts of need for information. The complexities are further compounded by the different ways cities operate and the legal aspects of it all. There are many property search sites available on the internet, and even an explosion of estate agencies. It would help if property services could do more for the buyer, the seller, and the tenant. Open urban data certainly helps with clear visualization of patterns, but it is also important to know the right places to look as well as the history of a property. Doing excessive background checks on tenants does not help either, because it does not say enough about a tenant, or whether a seller or buyer is likely to back off an offer, or even whether a mortgage application is likely to be approved. People want smart services that can do more than provide a list of information. They want to be assisted through the many unique and typical concerns that at times need to be personalized to each individual's circumstances. Even smart calculators and planning tools, as sketched below, help alleviate much of the grind work. There are also very few smart services out there to help the tenant, the student, or even the first-time buyer, and there are many options for adding semantic context to services. Buyers and sellers want to be able to make informed decisions, and tenants want to find accommodation as fast as possible, within their specific circumstances, knowing all the information about locations and their chances of getting a place at the right value. Perhaps more needs to be done in the property domain to cater for these many unique needs in the form of creative and intelligent applications that provide clear informational visualization with intuitive interaction design.
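
As a small example of the kind of calculator that removes grind work, below is a minimal sketch in Java of the standard amortized mortgage payment formula; the figures used are made up purely for illustration.

    public class MortgageCalculator {
        // Standard amortization formula: the monthly payment for a given principal,
        // annual interest rate (e.g. 0.045 for 4.5%), and term in years.
        static double monthlyPayment(double principal, double annualRate, int years) {
            double r = annualRate / 12.0;      // monthly interest rate
            int n = years * 12;                // total number of monthly payments
            if (r == 0) return principal / n;  // zero-interest edge case
            double factor = Math.pow(1 + r, n);
            return principal * r * factor / (factor - 1);
        }

        public static void main(String[] args) {
            // Hypothetical example: 250,000 borrowed at 4.5% over 25 years.
            System.out.printf("Monthly payment: %.2f%n",
                              monthlyPayment(250_000, 0.045, 25));
        }
    }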

Drones For Services

Drones are starting to be used in experiments for postal delivery, takeaways, and even grocery shopping. However, they have some serious drawbacks. For one thing, they can be used for snooping on others and invading our right to privacy. At the same time, they could fall prey to pranks and attacks from humans and even other drones; an almost simulated drone warfare could become a nightmare for public services, especially where there are visible electric wires in the neighborhood. Even the manoeuvring of drones can seem like a big maze of obstacles in the form of tall trees, chimneys, tall buildings, or even birds. What about migratory birds, and how drones could affect their seasonal behavior? Nature could also play its part in the form of hail, rain, and heavy storms. Perhaps it even means the loss of the delivery man and the end of commissions. Can we really be assured of the way a drone delivers our food, or the way it delivers our post through GPS tracking? Can a drone really be taught all the various unpredictable things of human society and the manners of city life? Simulations alone are not enough to provide a clear level of intelligence for the unpredictability and uncertainties of nature and human societies. A drone really needs more than neural capacity; it needs a degree of conscience, the will to make conscious decisions based on changing surroundings and the best course of action that is humanly possible. Drones can also cost a lot to enable and for humans to adapt to within their daily lives, and businesses often take a lot longer to realize the full potential of anything that drives change to the status quo. It may appear that the drive for more robotics in business is a way of adding Big Brother surveillance beyond the reproach of visible cameras. So perhaps the increase in interest and the creative uses for drones really need to be taken with an open mind and a pinch of salt, as not all such creative ideas have an underlying ethical intent or a so-called productive use in mind. Perhaps there is more of a drive towards hidden levels of monitoring of people, hiding behind the covers of plausible deniability and cultivated as a way of adapting new experiences into society. Or maybe people really can take the moral high ground and build drones that make our society more efficient and productive. Ubiquity is the underlying driver for drones, as are web services; and with web services one can add more semantic value, which also means drones can act as multi-agents with deductive reasoning abilities.