WebService

Revision as of 08:42, 14 April 2011

Goals

First, a web service will be developed to extract data from the MediaWiki so that third-party websites can display and process it (graphs, diagrams, ...).

Then, in order to complement and enrich the content available on the MW, web services (one web service per external source) will allow data to be harvested from external sources (GRIN, Mansfeld, Prota4U, ...).

Content Export

The export of textual content will have two objectives:

  • Display text on a distant (third-party) website, either as a whole page or only a selected part of it.
  • Give the exported data a structure that allows processing equivalent to what is possible with data from a "real" database.

How?

1) Explore all the possibilities built into MediaWiki for exporting pages.
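
For step 1, MediaWiki already ships with two export routes worth exploring: the Special:Export page, which returns an XML dump of one or more pages, and the api.php web API. The wiki address is a placeholder here and "Sorghum_bicolor" stands for any page title:

  /wiki/Special:Export/Sorghum_bicolor                                                       (XML dump of the page)
  /w/api.php?action=query&titles=Sorghum_bicolor&prop=revisions&rvprop=content&format=xml    (raw wikitext)
  /w/api.php?action=parse&page=Sorghum_bicolor&format=json                                   (rendered HTML)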

2) Work on standardising the structure of pages on our MW.

3) Make the structure uniform for data of the same type (taxobox, phylogeny, ...).
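
Purely as an illustration of step 3, a uniform taxobox could mean that every species page carries the same template call with the same parameter names; the template name and fields below are invented for the example, not an agreed structure:

  {{taxobox
  | family  = Poaceae
  | genus   = Sorghum
  | species = bicolor
  }}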

4) Develop web services based on a REST architecture that return, for each page, an XML file using the common structure.
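
A minimal sketch of such an endpoint for step 4, assuming PHP (the tentative choice below) and a throw-away <page><title/><content/> structure; the wiki URL, file name and XML layout are placeholders, not design decisions:

  <?php
  // webservice/page.php -- sketch of a REST-style export endpoint.
  $wikiApi = 'https://example.org/w/api.php';                        // placeholder wiki URL
  $title   = isset($_GET['title']) ? $_GET['title'] : 'Main Page';

  // Fetch the raw wikitext of the requested page through the MediaWiki API
  // (older response format; recent wikis may also want rvslots=main).
  $url = $wikiApi . '?' . http_build_query(array(
      'action' => 'query',
      'titles' => $title,
      'prop'   => 'revisions',
      'rvprop' => 'content',
      'format' => 'json',
  ));
  $data  = json_decode(file_get_contents($url), true);
  $pages = $data['query']['pages'];
  $page  = array_shift($pages);
  $text  = isset($page['revisions'][0]['*']) ? $page['revisions'][0]['*'] : '';

  // Wrap the result in the (hypothetical) common XML structure and return it.
  $doc  = new DOMDocument('1.0', 'UTF-8');
  $root = $doc->appendChild($doc->createElement('page'));
  $root->appendChild($doc->createElement('title'))->appendChild($doc->createTextNode($title));
  $root->appendChild($doc->createElement('content'))->appendChild($doc->createTextNode($text));
  header('Content-Type: application/xml; charset=utf-8');
  echo $doc->saveXML();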

5) Use those web services to display wiki data on third-party websites.
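
Continuing the same sketch for step 5, a third-party page could consume that XML as follows; the endpoint URL and the title/content fields are the placeholders used above:

  <?php
  // Third-party page displaying content fetched from the (hypothetical) export service.
  $endpoint = 'https://example.org/webservice/page.php?title=Sorghum_bicolor';   // placeholder
  $xml = simplexml_load_file($endpoint);

  if ($xml !== false) {
      echo '<h1>' . htmlspecialchars((string) $xml->title) . '</h1>';
      echo '<pre>' . htmlspecialchars((string) $xml->content) . '</pre>';
  } else {
      echo 'The web service could not be reached.';
  }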

6) Export contents through those web services:

- allow the export of a whole page;
- allow the export of a precise data type, e.g. a particular template.
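
For instance, the two export granularities of step 6 could map onto request parameters like these; both the path and the parameter names are hypothetical:

  GET /webservice/page.php?title=Sorghum_bicolor                     (whole page)
  GET /webservice/page.php?title=Sorghum_bicolor&template=taxobox    (one template only)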

7) Get (XML?) contents from the web services and display them as graphs/diagrams.
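
One way to feed step 7 is to reshape the web-service XML into a JSON series that a client-side charting library can draw. The <region .../> elements below are invented for this example; real graphs need numeric fields agreed on in step 3:

  <?php
  // Sketch: turn (hypothetical) web-service XML into a JSON series for a chart.
  $xml = simplexml_load_string(
      '<page><title>Sorghum bicolor</title>'
      . '<region name="Africa" count="12"/><region name="Asia" count="7"/></page>'
  );

  $series = array();
  foreach ($xml->region as $r) {
      $series[] = array('label' => (string) $r['name'], 'value' => (int) $r['count']);
  }
  header('Content-Type: application/json; charset=utf-8');
  echo json_encode($series);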

Import Contents

Import contents from external websites to meet two needs:

  • Give existing groups useful information to encourage their participation.
  • Complete the information on the MW with data extracted from international databanks.

How?

1) Determine the common structure shared by all the reference websites (maybe plain HTML).
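
If the only common denominator really is HTML, step 1 amounts to inspecting each source page with a DOM parser; the URL and XPath below are placeholders, since each source will need its own selectors:

  <?php
  // Sketch: load an external reference page as HTML and walk a data table.
  $html = file_get_contents('https://example.org/taxon/12345');    // placeholder URL

  $doc = new DOMDocument();
  libxml_use_internal_errors(true);        // tolerate real-world, invalid HTML
  $doc->loadHTML($html);
  libxml_clear_errors();

  $xpath = new DOMXPath($doc);
  // Hypothetical selector: rows of a "taxon-data" table holding name/value pairs.
  foreach ($xpath->query('//table[@class="taxon-data"]//tr') as $row) {
      echo trim($row->textContent), "\n";
  }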

2) Discover the specific data structures of GRIN, Mansfeld and GBIF.
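
Of the three, GBIF publishes a documented public REST API (api.gbif.org); the species-matching call below is real, but the fields kept here are only an example, and GRIN and Mansfeld would each need their own connector:

  <?php
  // Sketch: look a plant name up in the GBIF backbone taxonomy.
  $name = 'Sorghum bicolor';
  $url  = 'https://api.gbif.org/v1/species/match?name=' . urlencode($name);

  $record = json_decode(file_get_contents($url), true);
  if (isset($record['usageKey'])) {
      echo $record['scientificName'], ' (', $record['rank'], '), GBIF key ', $record['usageKey'], "\n";
  }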

3) Create a web service for the actual import of data into the MediaWiki.
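
Writing into the wiki can go through the standard MediaWiki API (action=edit). The sketch below assumes a recent MediaWiki where the CSRF token comes from action=query&meta=tokens, and leaves out the bot login and session cookies that a real import script must handle; the wiki URL, page title and wikitext are placeholders:

  <?php
  // Sketch: push one piece of harvested data into the wiki via api.php.
  $api = 'https://example.org/w/api.php';                          // placeholder wiki URL

  // 1) Fetch an edit (CSRF) token; a real script logs in first and reuses its cookies.
  $token = json_decode(file_get_contents($api . '?action=query&meta=tokens&type=csrf&format=json'), true);
  $csrf  = $token['query']['tokens']['csrftoken'];

  // 2) Post the edit.
  $post = http_build_query(array(
      'action'  => 'edit',
      'title'   => 'Sorghum bicolor',                              // placeholder page
      'text'    => '{{taxobox | genus = Sorghum | species = bicolor }}',
      'summary' => 'Import from external databank (sketch)',
      'token'   => $csrf,
      'format'  => 'json',
  ));
  $context = stream_context_create(array('http' => array(
      'method'  => 'POST',
      'header'  => 'Content-Type: application/x-www-form-urlencoded',
      'content' => $post,
  )));
  echo file_get_contents($api, false, $context);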

4) Define which data are of interest and import them into the MW.

Technological choice

Not defined at this time: maybe PHP in conjunction with jQuery and a REST architecture, with data encoded in JSON.

Ultimately, the web services must be able to handle requests both from a partially controlled source (the MW) and from uncontrolled sources (third-party sites), identify the relevant information, then retrieve, format and return it to the sender of the request.
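
A rough sketch of that single entry point, assuming the tentative PHP/JSON choice above; the format parameter and the response fields are assumptions used only to illustrate one service answering two kinds of callers:

  <?php
  // Sketch: one entry point, two output formats, any caller.
  $format = isset($_GET['format']) ? $_GET['format'] : 'json';     // hypothetical parameter
  $title  = isset($_GET['title'])  ? $_GET['title']  : '';

  // ...here the service would look the page up, as in the export sketch above...
  $payload = array('title' => $title, 'content' => '(page content)');

  if ($format === 'xml') {
      header('Content-Type: application/xml; charset=utf-8');
      echo '<page><title>' . htmlspecialchars($payload['title']) . '</title></page>';
  } else {
      header('Content-Type: application/json; charset=utf-8');
      echo json_encode($payload);
  }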