Search Results

Search found 21861 results on 875 pages for 'external library'.

  • What to do if one library is not multi-threaded?

    - by LB
    Hi, I would like to multi-thread an application, but one library I'm using is not multi-thread capable (I don't know the right term; thread-safe? synchronized?). What are my options? As far as I know there's nothing in between threads and processes (Runtime.exec) in Java: there is no abstraction in the JVM for something like an isolated "Java process". How would you deal with that?
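
    One common workaround (a sketch, not from the original thread) is to confine every call to the non-thread-safe library to a single dedicated thread, for example with a single-threaded executor; LegacyLibrary and compute() below are hypothetical stand-ins for the real API:

      import java.util.concurrent.Callable;
      import java.util.concurrent.ExecutorService;
      import java.util.concurrent.Executors;
      import java.util.concurrent.Future;

      // Sketch: all callers funnel through this gateway, and the single-threaded
      // executor guarantees the library only ever runs on its one worker thread.
      public class LegacyGateway {

          // Stand-in for the real non-thread-safe library.
          static class LegacyLibrary {
              String compute(String input) { return input.toUpperCase(); }
          }

          private final ExecutorService worker = Executors.newSingleThreadExecutor();
          private final LegacyLibrary library = new LegacyLibrary();

          public Future<String> compute(final String input) {
              return worker.submit(new Callable<String>() {
                  public String call() {
                      return library.compute(input);
                  }
              });
          }

          public void shutdown() {
              worker.shutdown();
          }
      }

    If the library also keeps global state that must not be shared at all, the heavier alternative is running it in a separate JVM started via Runtime.exec/ProcessBuilder and talking to it over a socket or stdin/stdout.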

  • LVM and cloning HDs

    - by jcea
    Using Linux, I have several backup levels. One of them is a periodical sector by sector copy (using dd) of my laptop harddisk to an external USB disk. Yes, I have other backups too, like remote rsync. This approach (the disk dd) is OK when cloning a HDD with no LVM volumes, since I can plug the external disk anytime and mount the partitions simply mounting /dev/sdb* instead of /dev/sda*. Trivial and handy. Today I moved ALL my harddisk (including the /boot) to LVM. Everything works fine. I will stress it for a couple of days, and then I will do a sector by sector copy to my external harddisk. Now I have a problem, I guess. If in the future I plug the external USB HDD to recover any file, the OS will detect a duplicate LVM configuration, with the same name and the same UUID. Even doing a vgrename (which LVM would be renamed, the internal HDD or the external HDD?), the cloned UUID will not change. Is there any command to change name and UUID? Ideally I would clone the HDD and then change the LVM group name and its UUID, but I don't know how to do it. Another related issue would be... In the past I have booted my laptop using the external disk, using the BIOS boot menu and changing GRUB entries manually to boot from /dev/sdb instead of /dev/sda. But now my current GRUB configuration boots directly from a LVM logical volume, something like: set root='(LVM-root)' in my grub.cfg. So... What is going to happen with duplicated volumes? Any suggestion? I guess I could repartition my external harddisk and change backup strategy from dd to rsync, but this disk has windows installed too, and I really would like to have a physical "real" copy.
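
    For the rename/UUID question specifically, a minimal shell sketch (assuming a reasonably recent LVM2 and that the cloned PV shows up as /dev/sdb2; adjust device names to your layout):

      # Rename the cloned VG and regenerate its VG/PV UUIDs in one step so it
      # no longer collides with the volume group on the internal disk.
      vgimportclone --basevgname backupvg /dev/sdb2

      # Alternatively, with the cloned VG deactivated, the UUIDs can be
      # regenerated by hand:
      pvchange --uuid /dev/sdb2
      vgchange --uuid backupvg

    vgrename itself operates on whichever volume group you name (the clone can be addressed by its UUID while the names still clash), but it does not change UUIDs, which is why vgimportclone is the more complete fix.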

  • Using UIScreen to drive a VGA display - doesn't seem to show the UIWindow?

    - by Peter Hajas
    Hi there, I'm trying to use UIScreen to drive a separate screen with the VGA dongle on my iPad. Here's what I've got in my root view controller's viewDidLoad:

      //Code to detect if an external display is connected to the iPad.
      NSLog(@"Number of screens: %d", [[UIScreen screens] count]);

      //Now, if there's an external screen, we need to find its modes, iterate through them
      //and find the highest one. Once we have that mode, break out, and set the UIWindow.
      if([[UIScreen screens] count] > 1) //if there is more than one screen connected to the device
      {
          CGSize max;
          UIScreenMode *maxScreenMode;
          for(int i = 0; i < [[[[UIScreen screens] objectAtIndex:1] availableModes] count]; i++)
          {
              UIScreenMode *current = [[[[UIScreen screens] objectAtIndex:1] availableModes] objectAtIndex:i];
              if(current.size.width > max.width);
              {
                  max = current.size;
                  maxScreenMode = current;
              }
          }
          //Now we have the highest mode. Turn the external display to use that mode.
          UIScreen *external = [[UIScreen screens] objectAtIndex:1];
          external.currentMode = maxScreenMode;

          //Boom! Now the external display is set to the proper mode. We need to now set
          //the screen of a new UIWindow to the external screen.
          external_disp = [externalDisplay alloc];
          external_disp.drawImage = drawViewController.drawImage;
          UIWindow *newwindow = [UIWindow alloc];
          [newwindow addSubview:external_disp.view];
          newwindow.screen = external;
      }
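
    A few things in the snippet above would keep the window from ever appearing: the stray semicolon after the if, the uninitialized CGSize, the missing -init calls, and the new UIWindow never being retained or unhidden. A hedged Objective-C sketch of the same logic with those points addressed (externalDisplay, drawViewController and the externalWindow ivar are the poster's own/assumed names):

      if ([[UIScreen screens] count] > 1) {
          UIScreen *external = [[UIScreen screens] objectAtIndex:1];

          CGSize max = CGSizeZero;
          UIScreenMode *maxScreenMode = nil;
          for (UIScreenMode *mode in [external availableModes]) {
              if (mode.size.width > max.width) {      // note: no trailing ';' here
                  max = mode.size;
                  maxScreenMode = mode;
              }
          }
          external.currentMode = maxScreenMode;

          // Keep the window in an ivar/property so it is not released immediately.
          externalWindow = [[UIWindow alloc] initWithFrame:CGRectMake(0, 0, max.width, max.height)];
          externalWindow.screen = external;

          external_disp = [[externalDisplay alloc] init];          // poster's own class
          external_disp.drawImage = drawViewController.drawImage;
          [externalWindow addSubview:external_disp.view];

          externalWindow.hidden = NO;   // an external-screen window just needs to be unhidden
      }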

  • FLTK Images (Using Cmake)

    - by Cenoc
    I'm trying to load and manipulate a JPEG file using FLTK compiled with CMake, but I get the following errors:

      2>LINK : warning LNK4098: defaultlib 'MSVCRT' conflicts with use of other libs; use /NODEFAULTLIB:library
      2>fltkimages.lib(Fl_JPEG_Image.obj) : error LNK2019: unresolved external symbol _jpeg_read_scanlines referenced in function "public: __thiscall Fl_JPEG_Image::Fl_JPEG_Image(char const *)" (??0Fl_JPEG_Image@@QAE@PBD@Z)
      ... (the same LNK2019 error is reported for _jpeg_start_decompress, _jpeg_calc_output_dimensions, _jpeg_read_header, _jpeg_stdio_src, _jpeg_CreateDecompress, _jpeg_destroy_decompress and _jpeg_finish_decompress, all referenced from the Fl_JPEG_Image constructor)

    I hate these linking errors... can anyone help? Thank you in advance.
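
    The unresolved _jpeg_* symbols usually just mean the JPEG code itself was never linked; FLTK builds it as a separate library alongside fltk_images. A minimal CMake sketch, assuming the bundled FLTK libraries were built with their default target names (adjust to however your FLTK was installed):

      # Sketch: link the image helper libraries as well as fltk itself.
      # Exact names (fltk_images, fltk_jpeg, fltk_png, fltk_z) depend on the
      # FLTK build; check the lib directory of your FLTK install.
      add_executable(myapp main.cpp)
      target_link_libraries(myapp fltk_images fltk_jpeg fltk_png fltk_z fltk)

    The LNK4098/MSVCRT warning is a separate issue and typically points at mixing Debug and Release C runtime libraries between FLTK and the application.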

  • Is there any way to automatically break into the debugger when my class library functions are getting called?

    - by mishal153
    I have a managed class library (say mylib.dll) and a third-party managed app (say app.exe) which uses mylib.dll. I have the code of mylib.dll but not of app.exe. So currently what I do is build mylib.dll, copy it to app.exe's directory, start app.exe and attach to the process. That way, if I put breakpoints in the code of mylib.dll, I see them being hit. But is there any way to automatically break into the code of mylib.dll whenever any external application calls one of its exposed methods, i.e. only at the entry points of the DLL? Thanks, Mishal
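
    One approach (a sketch, not something the poster mentioned) is to have the library itself request a debugger at its public entry points, guarded so normal runs are unaffected; ExposedMethod and the MYLIB_BREAK variable below are made-up names:

      using System;
      using System.Diagnostics;

      // Sketch: break into a debugger whenever an external caller enters the library.
      public static class PublicApi
      {
          public static void ExposedMethod()   // hypothetical exported entry point
          {
              if (Environment.GetEnvironmentVariable("MYLIB_BREAK") == "1" && !Debugger.IsAttached)
              {
                  Debugger.Launch();   // shows the just-in-time "attach a debugger" prompt
                  Debugger.Break();    // stops right here once a debugger is attached
              }

              // ... real library work ...
          }
      }

    Setting MYLIB_BREAK=1 before starting app.exe turns the behaviour on; without it the guard is a cheap no-op.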

  • How do I get a full Magento session in an external script? (Specifically, Catalog Rules)

    - by Laizer
    I'm running an external script that loads up a Magento session. Within that script, I'm loading products and grabbing a bunch of properties. The one issue is that getFinalPrice() does not apply the catalog rules that apply to the product. I'm doing everything I know to set the session, even a bunch of stuff that I think is superfluous. Nothing seems to get these rules applied. Here's a test script:

      require_once "app/Mage.php";
      umask(0);
      $app = Mage::app("default");
      $app->getTranslator()->init('frontend'); //Probably not needed
      Mage::getSingleton('core/session', array('name'=>'frontend'));
      $session = Mage::getSingleton("customer/session");
      $session->start(); //Probably not needed
      $session->loginById(122);

      $product = Mage::getModel('catalog/product')->load(1429);
      echo $product->getFinalPrice();

    Any insight is appreciated.
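
    A commonly suggested explanation (hedged; untested against this exact setup) is that catalog price rules are applied by observers Magento registers only for the frontend event area, which a bare external script never loads. A sketch for Magento 1.x:

      <?php
      // Sketch only: load the frontend event area before asking for prices so
      // the catalogrule observers actually run.
      require_once "app/Mage.php";
      umask(0);
      Mage::app('default');

      Mage::app()->loadAreaPart(
          Mage_Core_Model_App_Area::AREA_FRONTEND,
          Mage_Core_Model_App_Area::PART_EVENTS
      );

      $product = Mage::getModel('catalog/product')->load(1429);
      echo $product->getFinalPrice();

    If that is not enough, catalog rule prices also depend on website, customer group and date context, so those may need to be set explicitly as well.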

  • How to load external XML in an AIR application (for a Flash programmer)?

    - by Ayman
    Hi, I ran into this problem a couple of days ago while trying to import an external XML file into an AIR application:

      import flash.net.URLRequest;

      var ldr:Loader = new Loader();
      var url:String = "http://willperone.net/rss.php";
      var urlReq:URLRequest = new URLRequest(url);
      ldr.load(urlReq);
      ldr.addEventListener(Event.COMPLETE, function(e) {
          trace('Wow, completed ...');
      });
      ldr.contentLoaderInfo.addEventListener(IOErrorEvent.IO_ERROR, function(e) {
          trace('IO_ERROR');
      });

    The IO_ERROR handler always fires. Am I doing something wrong, or does something need a little configuration? Please help.
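
    flash.display.Loader is meant for SWFs and images, so an IO/format error on plain XML is expected; URLLoader is the usual tool for text or XML. A sketch against the same feed URL:

      import flash.net.URLLoader;
      import flash.net.URLRequest;
      import flash.events.Event;
      import flash.events.IOErrorEvent;

      // Sketch: URLLoader hands back the raw response on COMPLETE, which can then
      // be parsed as XML.
      var loader:URLLoader = new URLLoader();
      loader.addEventListener(Event.COMPLETE, function(e:Event):void {
          var feed:XML = new XML(loader.data);
          trace('Loaded, first title: ' + feed..title[0]);
      });
      loader.addEventListener(IOErrorEvent.IO_ERROR, function(e:IOErrorEvent):void {
          trace('IO_ERROR: ' + e.text);
      });
      loader.load(new URLRequest("http://willperone.net/rss.php"));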

  • Can I use "Online Backup" to backup my DVS instead of pushing to an external repo?

    - by Matt Brailsford
    Hi guys, I'm currently signed up with a third-party service that hosts my Mercurial repositories as a central hub to push my changes to, as a sort of backup. Now I'm looking at a system to back up my laptop and am considering Mozy. I'm a lone developer, I work on a laptop, I'm usually connected to the internet via wifi, and the laptop is only really on when I'm working, so I feel something like Mozy is my best option. My question is: if I'm the only developer, could I get away with just using local Mercurial repos and using Mozy to back everything up, rather than pushing to an external repo? Many thanks, Matt

  • Sample source code for processing messages of a window created by an external program?

    - by David
    I know I have to use SetWindowLongPtr with GWLP_WNDPROC and create my own WndProc that handles the message I want (such as WM_GETMINMAXINFO and modify the MINMAXINFO structure). However, because I want to do this for a window created by another program (like notepad.exe), I can't do this from my C#/WinForms program, I have to create a native C/C++ DLL that I have to inject in the the process that created the window. Can you provide a link or the sample code to do this (the native C++ DLL and the way to call it from C# and inject it into the external process)? Thank you
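
    Not a complete solution, but a sketch of the native side: a DLL that, once injected into the target process, subclasses a given window handle and adjusts WM_GETMINMAXINFO. The injection step itself (CreateRemoteThread/LoadLibrary or SetWindowsHookEx) and the C# driver are left out, and the exported function name is made up:

      #include <windows.h>

      static WNDPROC g_originalProc = NULL;

      /* Replacement window procedure: clamp the minimum tracking size, then
         forward everything to the original procedure. */
      static LRESULT CALLBACK HookedWndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
      {
          if (msg == WM_GETMINMAXINFO) {
              MINMAXINFO *mmi = (MINMAXINFO *)lParam;
              mmi->ptMinTrackSize.x = 400;   /* example values */
              mmi->ptMinTrackSize.y = 300;
          }
          return CallWindowProc(g_originalProc, hwnd, msg, wParam, lParam);
      }

      /* Exported so the injector can invoke it inside the target process. */
      __declspec(dllexport) void SubclassWindowByHandle(HWND hwnd)
      {
          g_originalProc = (WNDPROC)SetWindowLongPtr(hwnd, GWLP_WNDPROC, (LONG_PTR)HookedWndProc);
      }

    The key constraint is exactly the one in the question: SetWindowLongPtr with GWLP_WNDPROC only works from inside the process that owns the window, hence the DLL.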

  • Why do external Java libraries paths have to be refreshed in eclipse between starts?

    - by Jason
    For my school projects, I use the joda-time API for generating timestamps that are used for file names and logging. Getting eclipse to recognize the library is no problem. However, when I start eclipse between reboots, I get the red X in the project tree because the external API cannot be found, even though the file path has not changed. So each time, I have to go to the Libraries tab in Build Path to re-target the API. Frankly, this is getting to be a PIMA. So is there any way to have the path be permanent, so I don't have to do the Build Path rigmarole all the time?

  • Eclipse hangs when rebuilding after the addition of an external JAR file.

    - by celestialorb
    I'm fairly new to Eclipse so if this is something simple I apologize, however when I attempt to add an external JAR file to my build path (specifically the "rt.jar" file which contains certain tools that I require) and then rebuild my project, Eclipse will hang at the end of the Build process. It'll get to 100% then just hang there using 100% of one of my CPU cores. At first I thought it may have been due to the relatively large size of the rt.jar file, but I tried using smaller JAR files and it still hung at 100%. Any help would be greatly appreciated! If there is something wrong with using the rt.jar file does anyone know of another JAR file that contains both tools for dealing with SOAP requests as well as XML/DOM manipulation? Thanks again!
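
    As a side note, rt.jar normally should not be added as an external JAR at all: it is already on the classpath through the JRE System Library, and feeding Eclipse a second copy of the entire runtime is a plausible reason for the hang. For SOAP plus XML/DOM work, the JDK (6 and later) already ships the needed packages; a small sketch that compiles without touching rt.jar:

      import javax.xml.parsers.DocumentBuilder;
      import javax.xml.parsers.DocumentBuilderFactory;
      import javax.xml.soap.MessageFactory;
      import javax.xml.soap.SOAPMessage;
      import org.w3c.dom.Document;

      // Sketch: SOAP (SAAJ) and DOM straight from the standard class library
      // that the JRE System Library already provides.
      public class SoapAndDomDemo {
          public static void main(String[] args) throws Exception {
              SOAPMessage message = MessageFactory.newInstance().createMessage();
              message.getSOAPBody().addChildElement("ping");

              DocumentBuilder builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
              Document doc = builder.newDocument();
              doc.appendChild(doc.createElement("root"));

              System.out.println("SOAP body and DOM document created.");
          }
      }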

  • Running external php files or snippets with starting session in Modx?

    - by moogeek
    I want to include an external PHP script or MODx snippet in index.php, but it produces a blank screen instead (and no document parser errors). The problem is probably that the script I want to include starts a session and calls set_include_path(), which might somehow conflict with the MODx parser. I tried to use the MODx API but it doesn't seem to work. I'm still on MODx 0.9.2.6. How can I overcome this? My script checks the session and the database to see whether the user is logged in on the site (the login system is not MODx-based) and then prints a menu depending on the user's privileges.
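
    One hedged guess, with a sketch: guard the session and include-path calls so they do not fight MODx over an already-started session; external_menu.php and the menu/privilege helpers below are placeholders for the poster's own code:

      <?php
      // Sketch: only start a session if none is active yet, and append to the
      // include path instead of replacing it.
      if (session_id() === '') {
          session_start();
      }

      set_include_path(get_include_path() . PATH_SEPARATOR . '/path/to/legacy/lib');

      require_once 'external_menu.php';          // hypothetical external script
      print_menu_for(current_user_privileges()); // hypothetical helpers

    Wrapping the include in a try/catch (or temporarily enabling display_errors) also helps, since a fatal error inside the included file is another classic cause of a silent blank page.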

  • Tool to automate converting inline css to external css?

    - by Tony_Henrich
    I am working on a site which is full of inline CSS. Is there a tool to automatically refactor the pages so that the inline CSS is moved to an external CSS file? Preferably in a smart way that doesn't create duplicate CSS declarations, so if two elements carry the same inline style, such as style="padding-left: 12px", it creates one CSS class instead of two. Adobe Dreamweaver can do this manually, one inline style at a time; I'd prefer a tool that does all the work in one shot.

  • How to perform an external request in Kohana 3?

    - by alex
    I've always used cURL for this sort of thing, but this article got me thinking I could request another page easily using the Request object in Kohana 3:

      $url = 'http://www.example.com';
      $update = Request::factory($url);
      $update->method = 'POST';
      $update->post = array(
          'key' => 'value'
      );
      $update->execute();
      echo $update->response;

    However, I get the error "Accessing static property Request::$method as non static". From this I can assume that the method property is static, but that doesn't help me much. I also copied and pasted the example from that article and it threw the same error. Basically, I'm trying to POST to a page on an external server, and do it the Kohana way. So, am I doing this correctly, or should I just use cURL (or file_get_contents() with a stream context)?
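
    For what it's worth, Kohana 3.0's Request object only dispatches internal routes (external request clients arrived in later releases), so falling back to plain cURL is reasonable; a minimal sketch of the same POST:

      <?php
      // Sketch: straight cURL POST as a fallback when the framework's Request
      // object only handles internal routes.
      $ch = curl_init('http://www.example.com');
      curl_setopt($ch, CURLOPT_POST, true);
      curl_setopt($ch, CURLOPT_POSTFIELDS, array('key' => 'value'));
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
      curl_setopt($ch, CURLOPT_TIMEOUT, 10);

      $response = curl_exec($ch);
      if ($response === false) {
          echo 'cURL error: ' . curl_error($ch);
      } else {
          echo $response;
      }
      curl_close($ch);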

  • How do I view Visual Studio BuildLog.htm files without cutting and pasting into an external browser

    - by bgoodr
    This may or may not be specific to VS2005 (as that is the version I'm referring to for this question). I often see lines like this in the Output panel inside Visual Studio:

      2>Build log was saved at "file://c:\\vsdll_example\MyExecRefsDll\Debug\BuildLog.htm"

    Since that looks and smells like a URL, I would have thought I could simply click or double-click it and a browser window of some sort would open. No, that doesn't work. So, to view it, I have to cut and paste the "file://bla/bla/bla" part into an external browser. Is there a way to set up Visual Studio so I can browse to that file directly, or view it inside the Visual Studio IDE, without the extra fiddling with cutting and pasting? Or is there some keybinding I'm not aware of? Thanks, bg

  • How do I move the location of an xcodeproj file without breaking external build target?

    - by petFoo
    I have an Xcode project with a directory structure like this:

      MasterProjectDir/projectname.xcodeproj
      MasterProjectDir/ProjectSubDir/whatever.c
      MasterProjectDir/ProjectSubDir/etc.c
      MasterProjectDir/ProjectSubDir/Makefile

    My xcodeproj uses an external build target to point to the Makefile using these settings:

      Build Tool: /usr/bin/make
      Arguments: $(ACTION)
      Directory: ./ProjectSubDir

    For various reasons, I need to change the project directory structure to look like this:

      MasterProjectDir/projectname.xcodeproj
      MasterProjectDir/whatever.c
      MasterProjectDir/etc.c
      MasterProjectDir/Makefile

    I copied the .xcodeproj file into the ProjectSubDir and the project somehow still knows where to look for the files (?!?! - this is odd because their location is set as "relative to group" and I've just moved the xcodeproj file). It won't build. I get the following error:

      make: *** No targets specified and no makefile found. Stop.
      Command /usr/bin/make failed with exit code 2

    I could use a little help on this. There must be a setting I need to change somewhere.

  • Facebook page linking to external site sign-up process, capture permission to write to wall in process?

    - by steve
    Hi all, I've had a good hunt through the archive but can't find anyone trying to do this, so I hope someone familiar with the Facebook API can confirm whether it's possible. Basically I have a client who wants to replicate their membership sign-up process in a tab on their Facebook page. The form would still submit to their own website for processing; we'd just be replicating the form fields. As an additional requirement they want to capture people's Facebook user ID and get permission to post back to a user's wall at the same time, the idea being that once the user is a member we can post to their wall so their friends see that they've signed up. Basically I'm after: 1) a sanity check that these things are possible; 2) the best method to build the form in a Facebook page - I'm guessing using JS to create all the fields and AJAX to submit to the external site? Thanks, Steve

  • JavaScript: why can't I load my external JS file in Drupal?

    - by Patrick
    Hi, when I add a script tag to load an external JavaScript file, my page is no longer displayed. There are no errors in Firebug, no errors such as "File not found" or "Not enough permissions"; the browser just displays a blank page for some reason.

      <?php print $head; ?>
      <?php print $styles; ?>
      <?php print $scripts; ?>
      <script type="text/JavaScript" src="main.js" />

    If I remove the last line everything works perfectly. The preceding PHP lines are the standard Drupal head lines. This is the content of my JS file:

      $(document).ready( function() {
          alert("hello");
      });

    Thanks
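
    The usual culprit here is the self-closing script tag: <script ... /> is not valid in HTML served as text/html, so the browser treats the rest of the page as script source and renders nothing. A hedged sketch of the two common fixes (Drupal 6 API assumed; the theme-path prefix is an assumption too):

      <?php
      // Fix 1: close the tag explicitly in page.tpl.php. The src is resolved
      // relative to the site root, hence the base_path()/path_to_theme() prefix.
      ?>
      <script type="text/javascript" src="<?php print base_path() . path_to_theme(); ?>/main.js"></script>
      <?php
      // Fix 2 (preferred): register the file from template.php so it ends up in
      // $scripts; this must run before $scripts is rendered.
      // drupal_add_js(path_to_theme() . '/main.js', 'theme');
      ?>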

  • Can I ask ANT to look into .classpath for external jars?

    - by kunjaan
    Right now I have:

      <!-- Classpath declaration -->
      <path id="project.classpath">
          <fileset dir="${lib.dir}">
              <include name="**/*.jar" />
              <include name="**/*.zip" />
          </fileset>
      </path>

      <!-- Compile Java source -->
      <target name="compile" depends="clean">
          <mkdir dir="${build.dir}" />
          <javac srcdir="${src.java.dir}" destdir="${build.dir}" nowarn="on">
              <classpath refid="project.classpath" />
          </javac>
      </target>

    Is there some way I can tell Ant to look into Eclipse's .classpath and figure out the external jars?
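
    There is no built-in Ant task for this, but a crude sketch is to let <xmlproperty> read .classpath and feed the comma-separated path attributes into a <filelist>; note it also picks up kind="src"/"con" entries, so treat it as a starting point only (projects such as ant4eclipse handle this properly):

      <!-- Sketch: xmlproperty joins repeated elements with commas, and filelist
           accepts a comma-separated file list, so the lib entries from Eclipse's
           .classpath can ride along next to the existing fileset. -->
      <xmlproperty file=".classpath" collapseAttributes="true"/>

      <path id="project.classpath">
          <fileset dir="${lib.dir}">
              <include name="**/*.jar" />
              <include name="**/*.zip" />
          </fileset>
          <filelist dir="${basedir}" files="${classpath.classpathentry.path}" />
      </path>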

  • How do I produce an external URL as part of a replace_html call in Ruby on Rails?

    - by vlasits
    Basically, I am attempting to render an external website (the URL of which is stored in the database) into a page in my Ruby on Rails app. I have a field in my model 'search', also called 'search', that contains web addresses of the form 'www.example.com' or 'example.com'. I am trying to use a link_to_function call with replace_html to replace the 'maincontent' div with an iframe tag, using the value of 'search' in the current instance as the src for the tag. My current attempt is the very ugly code below. I'd be grateful for either of the following types of responses: How can I rewrite the string concatenation to work correctly? Or how can I get the same effect (replacing the current content of the "mainContent" div with an iframe tag) using a different method? (I had to modify the code below to remove the < from the iframe.)

      link_to_function h(search.title) do |page|
        page.replace_html 'mainContent', 'iframe id="embedded" src="http://" + #{search.search} />'
      end
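
    A sketch of the string part only, using ordinary interpolation so the stored address ends up inside src (assuming search.search holds something like "example.com"):

      # Sketch: #{} only interpolates inside double-quoted or %() literals, so
      # build the whole iframe tag in one interpolated string.
      link_to_function h(search.title) do |page|
        page.replace_html 'mainContent',
          %(<iframe id="embedded" src="http://#{search.search}"></iframe>)
      end

    The original attempt mixed single quotes (no interpolation) with a stray + concatenation, which is why the URL never made it into the markup.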

  • How to set up a different context to point to an external directory outside webapps Tomcat/Java

    - by pinkb
    Hi folks, I have successfully mapped an external directory by creating an XML file like:

      <Context path="/uploads" docBase="C:\uploads\photos" crossContext="true"/>

    I named this XML file uploads.xml and saved it under "#Tomcat\conf\Catalina\localhost", where # is the directory where Tomcat is installed. When I start Tomcat (5) from the command line (the startup.bat batch file), the images can be accessed normally, e.g. "http://localhost:8080/uploads/user1.png". It works. However, I am using IntelliJ IDEA 8 for development, and when I start Tomcat from IntelliJ IDEA I am not able to access the context, i.e. the images: "http://localhost:8080/uploads/user1.png" returns "HTTP 400 Bad Request". The context path for my project is "http://localhost:8080/spark/". Any help or suggestion would be appreciated. Thanks, Pink
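
    A hedged guess: when the IDE launches Tomcat it assembles its own CATALINA_BASE from the installation's conf directory, and per-context files under conf/Catalina/localhost may not be carried across. Declaring the same context inside conf/server.xml, under the <Host> element, is one way to make it survive both launch methods (sketch only; adjust to your existing Host settings):

      <!-- conf/server.xml, inside the <Engine> element -->
      <Host name="localhost" appBase="webapps" unpackWARs="true" autoDeploy="true">
          <Context path="/uploads" docBase="C:\uploads\photos" crossContext="true"/>
      </Host>

    Alternatively, point the IDE's Tomcat run configuration at the same Tomcat base directory you use from the command line, so the uploads.xml descriptor is actually picked up.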

  • How to add types from external assembly to toolbox control? (WPF)

    - by Louis Rhys
    I am trying to do something like this in my WPF application:

      ToolboxControl ctrl = new ToolboxControl();
      Assembly assembly = Assembly.LoadFile(file);
      var category = new ToolboxCategory(assembly.GetName().Name);
      foreach (Type t in assembly.GetTypes())
      {
          var wrapper = new ToolboxItemWrapper(t, t.Name);
          category.Add(wrapper);
      }
      ctrl.Categories.Add(category);

    i.e. adding a ToolboxItemWrapper for each type found in an assembly. However, the last line throws the following exception (see image). All dependencies of the external assembly are also referenced in the main (WPF) application. So what's wrong here, and how do I fix it?
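
    Without the exception text this is only a guess, but Assembly.LoadFile does not probe the external assembly's folder for its dependencies, and that typically surfaces exactly when the types are first inspected (often as a ReflectionTypeLoadException whose LoaderExceptions property names the missing reference). A sketch using LoadFrom plus an AssemblyResolve fallback:

      using System;
      using System.IO;
      using System.Reflection;

      // Sketch: resolve the external assembly's neighbours from its own folder.
      static Assembly LoadWithDependencies(string file)
      {
          string folder = Path.GetDirectoryName(file);

          AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
          {
              string candidate = Path.Combine(folder, new AssemblyName(args.Name).Name + ".dll");
              return File.Exists(candidate) ? Assembly.LoadFrom(candidate) : null;
          };

          return Assembly.LoadFrom(file);
      }

    Catching ReflectionTypeLoadException around GetTypes() and logging its LoaderExceptions is the quickest way to see which dependency (or version) is actually missing.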

  • How to use mod_rewrite to change external incoming images to local images?

    - by STRiDOR
    Hi, I'm trying to figure out how to use mod_rewrite so that I can replace linked images (coming in externally) and use local ones instead. Why am I doing this? I have a plugin which I'm integrating into my site, which uses ugly external images as buttons, and I want to redo these buttons to match my site. The links come in externally and are not embedded in a plugin php somewhere, so I figure there might be some way of using mod_rewrite to intercept and replace the incoming links. I hope someone can help, thanks!
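
    Worth noting: mod_rewrite only ever sees requests that reach your own server, so this works only if the plugin's image URLs point at your host (or you proxy them); truly external links rendered into the page cannot be intercepted server-side. A minimal .htaccess sketch with made-up paths:

      # Sketch: map the plugin's button images to local replacements.
      # Only requests that actually hit this server can be rewritten.
      RewriteEngine On
      RewriteRule ^plugin/buttons/(.+)\.(png|gif|jpg)$ /images/mybuttons/$1.$2 [L]

    If the images really live on a third-party domain, the usual alternatives are overriding them with CSS or filtering the plugin's output before it reaches the page.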

  • Microsoft and jQuery

    - by Rick Strahl
    The jQuery JavaScript library has been steadily getting more popular and with recent developments from Microsoft, jQuery is also getting ever more exposure on the ASP.NET platform including now directly from Microsoft. jQuery is a light weight, open source DOM manipulation library for JavaScript that has changed how many developers think about JavaScript. You can download it and find more information on jQuery on www.jquery.com. For me jQuery has had a huge impact on how I develop Web applications and was probably the main reason I went from dreading to do JavaScript development to actually looking forward to implementing client side JavaScript functionality. It has also had a profound impact on my JavaScript skill level for me by seeing how the library accomplishes things (and often reviewing the terse but excellent source code). jQuery made an uncomfortable development platform (JavaScript + DOM) a joy to work on. Although jQuery is by no means the only JavaScript library out there, its ease of use, small size, huge community of plug-ins and pure usefulness has made it easily the most popular JavaScript library available today. As a long time jQuery user, I’ve been excited to see the developments from Microsoft that are bringing jQuery to more ASP.NET developers and providing more integration with jQuery for ASP.NET’s core features rather than relying on the ASP.NET AJAX library. Microsoft and jQuery – making Friends jQuery is an open source project but in the last couple of years Microsoft has really thrown its weight behind supporting this open source library as a supported component on the Microsoft platform. When I say supported I literally mean supported: Microsoft now offers actual tech support for jQuery as part of their Product Support Services (PSS) as jQuery integration has become part of several of the ASP.NET toolkits and ships in several of the default Web project templates in Visual Studio 2010. The ASP.NET MVC 3 framework (still in Beta) also uses jQuery for a variety of client side support features including client side validation and we can look forward toward more integration of client side functionality via jQuery in both MVC and WebForms in the future. In other words jQuery is becoming an optional but included component of the ASP.NET platform. PSS support means that support staff will answer jQuery related support questions as part of any support incidents related to ASP.NET which provides some piece of mind to some corporate development shops that require end to end support from Microsoft. In addition to including jQuery and supporting it, Microsoft has also been getting involved in providing development resources for extending jQuery’s functionality via plug-ins. Microsoft’s last version of the Microsoft Ajax Library – which is the successor to the native ASP.NET AJAX Library – included some really cool functionality for client templates, databinding and localization. As it turns out Microsoft has rebuilt most of that functionality using jQuery as the base API and provided jQuery plug-ins of these components. Very recently these three plug-ins were submitted and have been approved for inclusion in the official jQuery plug-in repository and been taken over by the jQuery team for further improvements and maintenance. 
Even more surprising: The jQuery-templates component has actually been approved for inclusion in the next major update of the jQuery core in jQuery V1.5, which means it will become a native feature that doesn’t require additional script files to be loaded. Imagine this – an open source contribution from Microsoft that has been accepted into a major open source project for a core feature improvement. Microsoft has come a long way indeed! What the Microsoft Involvement with jQuery means to you For Microsoft jQuery support is a strategic decision that affects their direction in client side development, but nothing stopped you from using jQuery in your applications prior to Microsoft’s official backing and in fact a large chunk of developers did so readily prior to Microsoft’s announcement. Official support from Microsoft brings a few benefits to developers however. jQuery support in Visual Studio 2010 means built-in support for jQuery IntelliSense, automatically added jQuery scripts in many projects types and a common base for client side functionality that actually uses what most developers are already using. If you have already been using jQuery and were worried about straying from the Microsoft line and their internal Microsoft Ajax Library – worry no more. With official support and the change in direction towards jQuery Microsoft is now following along what most in the ASP.NET community had already been doing by using jQuery, which is likely the reason for Microsoft’s shift in direction in the first place. ASP.NET AJAX and the Microsoft AJAX Library weren’t bad technology – there was tons of useful functionality buried in these libraries. However, these libraries never got off the ground, mainly because early incarnations were squarely aimed at control/component developers rather than application developers. For all the functionality that these controls provided for control developers they lacked in useful and easily usable application developer functionality that was easily accessible in day to day client side development. The result was that even though Microsoft shipped support for these tools in the box (in .NET 3.5 and 4.0), other than for the internal support in ASP.NET for things like the UpdatePanel and the ASP.NET AJAX Control Toolkit as well as some third party vendors, the Microsoft client libraries were largely ignored by the developer community opening the door for other client side solutions. Microsoft seems to be acknowledging developer choice in this case: Many more developers were going down the jQuery path rather than using the Microsoft built libraries and there seems to be little sense in continuing development of a technology that largely goes unused by the majority of developers. Kudos for Microsoft for recognizing this and gracefully changing directions. Note that even though there will be no further development in the Microsoft client libraries they will continue to be supported so if you’re using them in your applications there’s no reason to start running for the exit in a panic and start re-writing everything with jQuery. Although that might be a reasonable choice in some cases, jQuery and the Microsoft libraries work well side by side so that you can leave existing solutions untouched even as you enhance them with jQuery. 
The Microsoft jQuery Plug-ins – Solid Core Features One of the most interesting developments in Microsoft’s embracing of jQuery is that Microsoft has started contributing to jQuery via standard mechanism set for jQuery developers: By submitting plug-ins. Microsoft took some of the nicest new features of the unpublished Microsoft Ajax Client Library and re-wrote these components for jQuery and then submitted them as plug-ins to the jQuery plug-in repository. Accepted plug-ins get taken over by the jQuery team and that’s exactly what happened with the three plug-ins submitted by Microsoft with the templating plug-in even getting slated to be published as part of the jQuery core in the next major release (1.5). The following plug-ins are provided by Microsoft: jQuery Templates – a client side template rendering engine jQuery Data Link – a client side databinder that can synchronize changes without code jQuery Globalization – provides formatting and conversion features for dates and numbers The first two are ports of functionality that was slated for the Microsoft Ajax Library while functionality for the globalization library provides functionality that was already found in the original ASP.NET AJAX library. To me all three plug-ins address a pressing need in client side applications and provide functionality I’ve previously used in other incarnations, but with more complete implementations. Let’s take a close look at these plug-ins. jQuery Templates http://api.jquery.com/category/plugins/templates/ Client side templating is a key component for building rich JavaScript applications in the browser. Templating on the client lets you avoid from manually creating markup by creating DOM nodes and injecting them individually into the document via code. Rather you can create markup templates – similar to the way you create classic ASP server markup – and merge data into these templates to render HTML which you can then inject into the document or replace existing content with. Output from templates are rendered as a jQuery matched set and can then be easily inserted into the document as needed. Templating is key to minimize client side code and reduce repeated code for rendering logic. Instead a single template can be used in many places for updating and adding content to existing pages. Further if you build pure AJAX interfaces that rely entirely on client rendering of the initial page content, templates allow you to a use a single markup template to handle all rendering of each specific HTML section/element. I’ve used a number of different client rendering template engines with jQuery in the past including jTemplates (a PHP style templating engine) and a modified version of John Resig’s MicroTemplating engine which I built into my own set of libraries because it’s such a commonly used feature in my client side applications. jQuery templates adds a much richer templating model that allows for sub-templates and access to the data items. Like John Resig’s original Micro Template engine, the core basics of the templating engine create JavaScript code which means that templates can include JavaScript code. To give you a basic idea of how templates work imagine I have an application that downloads a set of stock quotes based on a symbol list then displays them in the document. 
To do this you can create an ‘item’ template that describes how each of the quotes is renderd as a template inside of the document: <script id="stockTemplate" type="text/x-jquery-tmpl"> <div id="divStockQuote" class="errordisplay" style="width: 500px;"> <div class="label">Company:</div><div><b>${Company}(${Symbol})</b></div> <div class="label">Last Price:</div><div>${LastPrice}</div> <div class="label">Net Change:</div><div> {{if NetChange > 0}} <b style="color:green" >${NetChange}</b> {{else}} <b style="color:red" >${NetChange}</b> {{/if}} </div> <div class="label">Last Update:</div><div>${LastQuoteTimeString}</div> </div> </script> The ‘template’ is little more than HTML with some markup expressions inside of it that define the template language. Notice the embedded ${} expressions which reference data from the quote objects returned from an AJAX call on the server. You can embed any JavaScript or value expression in these template expressions. There are also a number of structural commands like {{if}} and {{each}} that provide for rudimentary logic inside of your templates as well as commands ({{tmpl}} and {{wrap}}) for nesting templates. You can find more about the full set of markup expressions available in the documentation. To load up this data you can use code like the following: <script type="text/javascript"> //var Proxy = new ServiceProxy("../PageMethods/PageMethodsService.asmx/"); $(document).ready(function () { $("#btnGetQuotes").click(GetQuotes); }); function GetQuotes() { var symbols = $("#txtSymbols").val().split(","); $.ajax({ url: "../PageMethods/PageMethodsService.asmx/GetStockQuotes", data: JSON.stringify({ symbols: symbols }), // parameter map type: "POST", // data has to be POSTed contentType: "application/json", timeout: 10000, dataType: "json", success: function (result) { var quotes = result.d; var jEl = $("#stockTemplate").tmpl(quotes); $("#quoteDisplay").empty().append(jEl); }, error: function (xhr, status) { alert(status + "\r\n" + xhr.responseText); } }); }; </script> In this case an ASMX AJAX service is called to retrieve the stock quotes. The service returns an array of quote objects. The result is returned as an object with the .d property (in Microsoft service style) that returns the actual array of quotes. The template is applied with: var jEl = $("#stockTemplate").tmpl(quotes); which selects the template script tag and uses the .tmpl() function to apply the data to it. The result is a jQuery matched set of elements that can then be appended to the quote display element in the page. The template is merged against an array in this example. When the result is an array the template is automatically applied to each each array item. If you pass a single data item – like say a stock quote – the template works exactly the same way but is applied only once. Templates also have access to a $data item which provides the current data item and information about the tempalte that is currently executing. This makes it possible to keep context within the context of the template itself and also to pass context from a parent template to a child template which is very powerful. Templates can be evaluated by using the template selector and calling the .tmpl() function on the jQuery matched set as shown above or you can use the static $.tmpl() function to provide a template as a string. This allows you to dynamically create templates in code or – more likely – to load templates from the server via AJAX calls. 
In short there are options The above shows off some of the basics, but there’s much for functionality available in the template engine. Check the documentation link for more information and links to additional examples. The plug-in download also comes with a number of examples that demonstrate functionality. jQuery templates will become a native component in jQuery Core 1.5, so it’s definitely worthwhile checking out the engine today and get familiar with this interface. As much as I’m stoked about templating becoming part of the jQuery core because it’s such an integral part of many applications, there are also a couple shortcomings in the current incarnation: Lack of Error Handling Currently if you embed an expression that is invalid it’s simply not rendered. There’s no error rendered into the template nor do the various  template functions throw errors which leaves finding of bugs as a runtime exercise. I would like some mechanism – optional if possible – to be able to get error info of what is failing in a template when it’s rendered. No String Output Templates are always rendered into a jQuery matched set and there’s no way that I can see to directly render to a string. String output can be useful for debugging as well as opening up templating for creating non-HTML string output. Limited JavaScript Access Unlike John Resig’s original MicroTemplating Engine which was entirely based on JavaScript code generation these templates are limited to a few structured commands that can ‘execute’. There’s no code execution inside of script code which means you’re limited to calling expressions available in global objects or the data item passed in. This may or may not be a big deal depending on the complexity of your template logic. Error handling has been discussed quite a bit and it’s likely there will be some solution to that particualar issue by the time jQuery templates ship. The others are relatively minor issues but something to think about anyway. jQuery Data Link http://api.jquery.com/category/plugins/data-link/ jQuery Data Link provides the ability to do two-way data binding between input controls and an underlying object’s properties. The typical scenario is linking a textbox to a property of an object and have the object updated when the text in the textbox is changed and have the textbox change when the value in the object or the entire object changes. The plug-in also supports converter functions that can be applied to provide the conversion logic from string to some other value typically necessary for mapping things like textbox string input to say a number property and potentially applying additional formatting and calculations. In theory this sounds great, however in reality this plug-in has some serious usability issues. Using the plug-in you can do things like the following to bind data: person = { firstName: "rick", lastName: "strahl"}; $(document).ready( function() { // provide for two-way linking of inputs $("form").link(person); // bind to non-input elements explicitly $("#objFirst").link(person, { firstName: { name: "objFirst", convertBack: function (value, source, target) { $(target).text(value); } } }); $("#objLast").link(person, { lastName: { name: "objLast", convertBack: function (value, source, target) { $(target).text(value); } } }); }); This code hooks up two-way linking between a couple of textboxes on the page and the person object. 
The first line in the .ready() handler provides mapping of object to form field with the same field names as properties on the object. Note that .link() does NOT bind items into the textboxes when you call .link() – changes are mapped only when values change and you move out of the field. Strike one. The two following commands allow manual binding of values to specific DOM elements which is effectively a one-way bind. You specify the object and a then an explicit mapping where name is an ID in the document. The converter is required to explicitly assign the value to the element. Strike two. You can also detect changes to the underlying object and cause updates to the input elements bound. Unfortunately the syntax to do this is not very natural as you have to rely on the jQuery data object. To update an object’s properties and get change notification looks like this: function updateFirstName() { $(person).data("firstName", person.firstName + " (code updated)"); } This works fine in causing any linked fields to be updated. In the bindings above both the firstName input field and objFirst DOM element gets updated. But the syntax requires you to use a jQuery .data() call for each property change to ensure that the changes are tracked properly. Really? Sure you’re binding through multiple layers of abstraction now but how is that better than just manually assigning values? The code savings (if any) are going to be minimal. As much as I would like to have a WPF/Silverlight/Observable-like binding mechanism in client script, this plug-in doesn’t help much towards that goal in its current incarnation. While you can bind values, the ‘binder’ is too limited to be really useful. If initial values can’t be assigned from the mappings you’re going to end up duplicating work loading the data using some other mechanism. There’s no easy way to re-bind data with a different object altogether since updates trigger only through the .data members. Finally, any non-input elements have to be bound via code that’s fairly verbose and frankly may be more voluminous than what you might write by hand for manual binding and unbinding. Two way binding can be very useful but it has to be easy and most importantly natural. If it’s more work to hook up a binding than writing a couple of lines to do binding/unbinding this sort of thing helps very little in most scenarios. In talking to some of the developers the feature set for Data Link is not complete and they are still soliciting input for features and functionality. If you have ideas on how you want this feature to be more useful get involved and post your recommendations. As it stands, it looks to me like this component needs a lot of love to become useful. For this component to really provide value, bindings need to be able to be refreshed easily and work at the object level, not just the property level. It seems to me we would be much better served by a model binder object that can perform these binding/unbinding tasks in bulk rather than a tool where each link has to be mapped first. I also find the choice of creating a jQuery plug-in questionable – it seems a standalone object – albeit one that relies on the jQuery library – would provide a more intuitive interface than the current forcing of options onto a plug-in style interface. Out of the three Microsoft created components this is by far the least useful and least polished implementation at this point. 
jQuery Globalization http://github.com/jquery/jquery-global Globalization in JavaScript applications often gets short shrift and part of the reason for this is that natively in JavaScript there’s little support for formatting and parsing of numbers and dates. There are a number of JavaScript libraries out there that provide some support for globalization, but most are limited to a particular portion of globalization. As .NET developers we’re fairly spoiled by the richness of APIs provided in the framework and when dealing with client development one really notices the lack of these features. While you may not necessarily need to localize your application the globalization plug-in also helps with some basic tasks for non-localized applications: Dealing with formatting and parsing of dates and time values. Dates in particular are problematic in JavaScript as there are no formatters whatsoever except the .toString() method which outputs a verbose and next to useless long string. With the globalization plug-in you get a good chunk of the formatting and parsing functionality that the .NET framework provides on the server. You can write code like the following for example to format numbers and dates: var date = new Date(); var output = $.format(date, "MMM. dd, yy") + "\r\n" + $.format(date, "d") + "\r\n" + // 10/25/2010 $.format(1222.32213, "N2") + "\r\n" + $.format(1222.33, "c") + "\r\n"; alert(output); This becomes even more useful if you combine it with templates which can also include any JavaScript expressions. Assuming the globalization plug-in is loaded you can create template expressions that use the $.format function. Here’s the template I used earlier for the stock quote again with a couple of formats applied: <script id="stockTemplate" type="text/x-jquery-tmpl"> <div id="divStockQuote" class="errordisplay" style="width: 500px;"> <div class="label">Company:</div><div><b>${Company}(${Symbol})</b></div> <div class="label">Last Price:</div> <div>${$.format(LastPrice,"N2")}</div> <div class="label">Net Change:</div><div> {{if NetChange > 0}} <b style="color:green" >${NetChange}</b> {{else}} <b style="color:red" >${NetChange}</b> {{/if}} </div> <div class="label">Last Update:</div> <div>${$.format(LastQuoteTime,"MMM dd, yyyy")}</div> </div> </script> There are also parsing methods that can parse dates and numbers from strings into numbers easily: alert($.parseDate("25.10.2010")); alert($.parseInt("12.222")); // de-DE uses . for thousands separators As you can see culture specific options are taken into account when parsing. The globalization plugin provides rich support for a variety of locales: Get a list of all available cultures Query cultures for culture items (like currency symbol, separators etc.) Localized string names for all calendar related items (days of week, months) Generated off of .NET’s supported locales In short you get much of the same functionality that you already might be using in .NET on the server side. The plugin includes a huge number of locales and an Globalization.all.min.js file that contains the text defaults for each of these locales as well as small locale specific script files that define each of the locale specific settings. It’s highly recommended that you NOT use the huge globalization file that includes all locales, but rather add script references to only those languages you explicitly care about. Overall this plug-in is a welcome helper. 
Even if you use it with a single locale (like en-US) and do no other localization, you’ll gain solid support for number and date formatting which is a vital feature of many applications. Changes for Microsoft It’s good to see Microsoft coming out of its shell and away from the ‘not-built-here’ mentality that has been so pervasive in the past. It’s especially good to see it applied to jQuery – a technology that has stood in drastic contrast to Microsoft’s own internal efforts in terms of design, usage model and… popularity. It’s great to see that Microsoft is paying attention to what customers prefer to use and supporting the customer sentiment – even if it meant drastically changing course of policy and moving into a more open and sharing environment in the process. The additional jQuery support that has been introduced in the last two years certainly has made lives easier for many developers on the ASP.NET platform. It’s also nice to see Microsoft submitting proposals through the standard jQuery process of plug-ins and getting accepted for various very useful projects. Certainly the jQuery Templates plug-in is going to be very useful to many especially since it will be baked into the jQuery core in jQuery 1.5. I hope we see more of this type of involvement from Microsoft in the future. Kudos!© Rick Strahl, West Wind Technologies, 2005-2010Posted in jQuery  ASP.NET  
