Search Results


  • L2TP connection fails!

    - by a.toraby
    I've installed l2tp-ipsec-vpn, but when I try to connect to the VPN server I get error 500. Here are the logs:

    Jun 17 12:54:37.449 ipsec_setup: Stopping Openswan IPsec...
    Jun 17 12:54:38.858 Stopping xl2tpd: xl2tpd.
    Jun 17 12:54:38.859 xl2tpd[1511]: death_handler: Fatal signal 15 received
    Jun 17 12:54:38.872 ipsec_setup: Starting Openswan IPsec U2.6.37/K3.2.0-23-generic...
    Jun 17 12:54:39.027 ipsec__plutorun: Starting Pluto subsystem...
    Jun 17 12:54:39.033 ipsec__plutorun: adjusting ipsec.d to /etc/ipsec.d
    Jun 17 12:54:39.037 recvref[30]: Protocol not available
    Jun 17 12:54:39.038 xl2tpd[2442]: This binary does not support kernel L2TP.
    Jun 17 12:54:39.038 xl2tpd[2444]: xl2tpd version xl2tpd-1.3.1 started on atp-ThinkPad-SL410 PID:2444
    Jun 17 12:54:39.038 xl2tpd[2444]: Written by Mark Spencer, Copyright (C) 1998, Adtran, Inc.
    Jun 17 12:54:39.038 xl2tpd[2444]: Forked by Scott Balmos and David Stipp, (C) 2001
    Jun 17 12:54:39.038 xl2tpd[2444]: Inherited by Jeff McAdams, (C) 2002
    Jun 17 12:54:39.039 xl2tpd[2444]: Forked again by Xelerance (www.xelerance.com) (C) 2006
    Jun 17 12:54:39.039 xl2tpd[2444]: Listening on IP address 0.0.0.0, port 1701
    Jun 17 12:54:39.040 Starting xl2tpd: xl2tpd.
    Jun 17 12:54:39.062 ipsec__plutorun: 002 added connection description "L2TP"
    Jun 17 12:55:30.753 104 "L2TP" #1: STATE_MAIN_I1: initiate
    Jun 17 12:55:30.754 010 "L2TP" #1: STATE_MAIN_I1: retransmission; will wait 20s for response
    Jun 17 12:55:30.754 010 "L2TP" #1: STATE_MAIN_I1: retransmission; will wait 40s for response
    Jun 17 12:55:30.754 003 "L2TP" #1: ignoring Vendor ID payload [MS NT5 ISAKMPOAKLEY 00000008]
    Jun 17 12:55:30.754 003 "L2TP" #1: received Vendor ID payload [RFC 3947] method set to=109
    Jun 17 12:55:30.754 003 "L2TP" #1: received Vendor ID payload [draft-ietf-ipsec-nat-t-ike-02_n] meth=106, but already using method 109
    Jun 17 12:55:30.755 003 "L2TP" #1: ignoring Vendor ID payload [FRAGMENTATION]
    Jun 17 12:55:30.755 003 "L2TP" #1: ignoring Vendor ID payload [MS-Negotiation Discovery Capable]
    Jun 17 12:55:30.755 003 "L2TP" #1: ignoring Vendor ID payload [IKE CGA version 1]
    Jun 17 12:55:30.755 106 "L2TP" #1: STATE_MAIN_I2: sent MI2, expecting MR2
    Jun 17 12:55:30.755 010 "L2TP" #1: STATE_MAIN_I2: retransmission; will wait 20s for response
    Jun 17 12:55:30.755 003 "L2TP" #1: NAT-Traversal: Result using RFC 3947 (NAT-Traversal): i am NATed
    Jun 17 12:55:30.755 108 "L2TP" #1: STATE_MAIN_I3: sent MI3, expecting MR3
    Jun 17 12:55:30.756 004 "L2TP" #1: STATE_MAIN_I4: ISAKMP SA established {auth=OAKLEY_PRESHARED_KEY cipher=oakley_3des_cbc_192 prf=oakley_sha group=modp1024}
    Jun 17 12:55:30.756 117 "L2TP" #2: STATE_QUICK_I1: initiate
    Jun 17 12:55:30.756 010 "L2TP" #2: STATE_QUICK_I1: retransmission; will wait 20s for response
    Jun 17 12:55:30.756 003 "L2TP" #2: ignoring informational payload, type IPSEC_RESPONDER_LIFETIME msgid=6b03ff69
    Jun 17 12:55:30.756 003 "L2TP" #2: NAT-Traversal: received 2 NAT-OA. ignored because peer is not NATed
    Jun 17 12:55:30.756 003 "L2TP" #2: our client subnet returned doesn't match my proposal - us:192.168.1.3/32 vs them:109.162.174.235/32
    Jun 17 12:55:30.757 003 "L2TP" #2: Allowing questionable proposal anyway [ALLOW_MICROSOFT_BAD_PROPOSAL]
    Jun 17 12:55:30.757 004 "L2TP" #2: STATE_QUICK_I2: sent QI2, IPsec SA established transport mode {ESP=>0x23af21f8 <0xdb4a87b6 xfrm=AES_128-HMAC_SHA1 NATOA=none NATD=none DPD=none}
    Jun 17 12:55:31.759 xl2tpd[2444]: Connecting to host x.x.x.x, port 1701
    Jun 17 12:55:32.021 xl2tpd[2444]: Connection established to x.x.x.x, 1701. Local: 4720, Remote: 200 (ref=0/0).
    Jun 17 12:55:32.023 xl2tpd[2444]: Calling on tunnel 4720
    Jun 17 12:55:32.454 xl2tpd[2444]: Call established with x.x.x.x, Local: 9667, Remote: 3, Serial: 1 (ref=0/0)
    Jun 17 12:55:32.456 xl2tpd[2444]: start_pppd: I'm running:
    Jun 17 12:55:32.456 xl2tpd[2444]: "/usr/sbin/pppd"
    Jun 17 12:55:32.457 xl2tpd[2444]: "passive"
    Jun 17 12:55:32.458 xl2tpd[2444]: "nodetach"
    Jun 17 12:55:32.458 xl2tpd[2444]: ":"
    Jun 17 12:55:32.459 xl2tpd[2444]: "file"
    Jun 17 12:55:32.459 xl2tpd[2444]: "/etc/ppp/L2TP.options.xl2tpd"
    Jun 17 12:55:32.460 xl2tpd[2444]: "ipparam"
    Jun 17 12:55:32.461 xl2tpd[2444]: "x.x.x.x"
    Jun 17 12:55:32.462 xl2tpd[2444]: "/dev/pts/1"
    Jun 17 12:55:32.583 pppd[2711]: Plugin passprompt.so loaded.
    Jun 17 12:55:32.583 pppd[2711]: pppd 2.4.5 started by root, uid 0
    Jun 17 12:55:32.619 pppd[2711]: Using interface ppp0
    Jun 17 12:55:32.620 pppd[2711]: Connect: ppp0 <--> /dev/pts/1
    Jun 17 12:55:33.693 pppd[2711]: /usr/bin/L2tpIPsecVpn exited with code 0
    Jun 17 12:55:34.454 [ERROR 404] Authentication failed: closing connection to 'L2TP'
    Jun 17 12:55:34.456 pppd[2711]: MS-CHAP authentication failed: E=691 Authentication failure
    Jun 17 12:55:34.457 pppd[2711]: CHAP authentication failed
    Jun 17 12:55:34.461 Stopping xl2tpd: xl2tpd.
    Jun 17 12:55:34.462 xl2tpd[2444]: death_handler: Fatal signal 15 received
    Jun 17 12:55:34.463 pppd[2711]: Modem hangup
    Jun 17 12:55:34.463 pppd[2711]: Connection terminated.
    Jun 17 12:55:34.474 ipsec_setup: Stopping Openswan IPsec...
    Jun 17 12:55:34.482 pppd[2711]: Exit.
    Jun 17 12:55:35.587 ipsec_setup: ERROR: Module xfrm4_mode_transport is in use
    Jun 17 12:55:35.665 ipsec_setup: ERROR: Module esp4 is in use

    I had this problem with Ubuntu 11.10 too, though I can easily connect to the server from Windows. I am using Ubuntu 12.04, 64-bit.
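    A note for anyone comparing logs: the interesting failure here is the MS-CHAP "E=691 Authentication failure" near the end, which is the remote peer rejecting the PPP credentials after the IPsec and L2TP layers have already come up. A minimal sketch of the usual place those credentials live, /etc/ppp/chap-secrets, with placeholder values only (your setup may keep them in the L2TP options file or the GUI instead):

        # /etc/ppp/chap-secrets
        # client        server  secret          IP addresses
        myvpnuser       *       "myvpnpass"     *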


  • ODI 12c's Mapping Designer - Combining Flow Based and Expression Based Mapping

    - by Madhu Nair
    post by David Allan

    ODI is renowned for its declarative designer and minimal expression-based paradigm. The new ODI 12c release extends this even further with an extended declarative mapping designer. The ODI 12c mapper is a fusion of ODI's new declarative designer with the familiar flow-based designer, while retaining ODI's key differentiators: minimal expression-based definition; the ability to incrementally design an interface and to extract/load data from any combination of sources; and, most importantly, backing by ODI's extensible knowledge module framework.

    The declarative nature of the product has been extended to include an extensible library of common components that can be used to build anything from simple to complex data integration solutions. Big usability improvements, through consistent interactions of components and concepts all constructed around the familiar knowledge module framework, provide the utmost flexibility. Here is a little taster.

    So what is a mapping? A mapping comprises a logical design and at least one physical design; it may have many. A mapping can have many targets, of any technology, and can be arbitrarily complex. You can build reusable mappings and use them in other mappings or other reusable mappings. In the example below, all of the information from an Oracle bonus table and a bonus file is joined with an Oracle employees table before being written to a target. One cool feature is the one-click expression cross-referencing, so you can easily see what's used where within the design.

    The logical design in a mapping describes what you want to accomplish (see the animated GIF here illustrating how the above mapping was designed). The physical design lets you configure how it is to be accomplished. So you could have one logical design that is realized as an initial load in one physical design and as an incremental load in another. In the physical design below, we customize how the mapping is accomplished by picking Knowledge Modules. In ODI 12c you can pick multiple nodes (logical or physical) and see common properties, which is useful for quickly comparing property values across objects; below we can see knowledge module settings on the access points between execution units side by side. In the example, one table is retrieved via database links and the other is an external table.

    In the logical design I selected append mode for the integration type, so by default the IKM on the target will be the most suitable default IKM, which in this case is the built-in Oracle Insert IKM (see the image below). This supports insert and select hints for the Oracle database (the ANSI SQL Insert IKM does not), so by default you get direct-path inserts with Oracle on this statement.

    In ODI 12c, the mapper is just that: a mapper. Design your mapping and write to multiple targets; the targets can be in the same data server, in different data servers, or in totally different technologies, it does not matter. ODI 12c will derive and generate a plan that you can use or customize with knowledge modules. Use cases that are greatly simplified include multiple heterogeneous targets, multi-target inserts for Oracle, and writing XML.

    Let's switch it up now and look at a slightly different example to illustrate expression reuse. In ODI you can define reusable expressions using user functions. These can be reused across mappings, with the implementations specialized per technology.
    So you can have common expressions across Oracle, SQL Server, Hive, etc., shielding the design from the physical aspects of the generated language. Another way to reuse is within a mapping itself. In ODI 12c, expressions can be defined and reused within a mapping. Rather than replicating the expression text inside larger expressions, you can decompose it into smaller snippets. Below you can see that UNIT_TAX_AMOUNT has been defined and is used in two downstream target columns: it's used in TOTAL_TAX_AMOUNT, and it's used in UNIT_TAX_AMOUNT (a recording of the calculation). You can see the columns that an expression depends on (upstream) and the columns the expression is used in (downstream) highlighted within the mapper. Multi-selecting attributes is also a convenient way to see what's being used where; below I have selected TOTAL_TAX_AMOUNT in the target datastore and UNIT_TAX_AMOUNT in UNIT_CALC. You can now see many expressions at once and understand much more at one time, without needlessly clicking around and memorizing information.

    Our mantra during development was to keep it simple, make the tool more powerful, and do even more for the user. The development team was a fusion of many teams from Oracle Warehouse Builder, Sunopsis and BEA AquaLogic, debating and perfecting the mapper in ODI 12c. This was quite a project, from supporting the capabilities of ODI 11g to building the flow-based mapping tool to support the future. I hope this was a useful insight. There is so much more to come on this topic; this is just a preview of what you will see of the mapper in ODI 12c.


  • Test-Drive ASP.NET MVC Review

    - by Ben Griswold
    A few years back I started dallying with test-driven development, but I never fully committed to the practice. This wasn't because I didn't believe in the value of TDD; it was more a matter of not completely understanding how to incorporate "test first" into my everyday development. Back in my web forms days, I could point fingers at the framework for my ignorance and laziness. After all, web forms weren't exactly designed for testability, so who could blame me for not embracing TDD in those conditions, right? But when I switched to ASP.NET MVC, I quickly found myself fresh out of excuses, and it became instantly clear that it was time to get my head around red-green-refactor once and for all or I would regretfully miss out on one of the biggest selling points the new framework had to offer.

    I have previously written about how I learned ASP.NET MVC. It was primarily hands-on learning, but I did read a couple of ASP.NET MVC books along the way. The books I read dedicated a chapter or two to TDD, and they certainly addressed the benefits of TDD and how MVC was designed with testability in mind, but TDD was merely an afterthought compared to, well, teaching one how to code the model, view and controller. This approach made some sense, and I learned a bunch about MVC from those books, but when it came to TDD the books were just a teaser and an opportunity missed.

    But then I got lucky: Jonathan McCracken contacted me and asked if I'd review his book, Test-Drive ASP.NET MVC, and it was just what I needed to get over the TDD hump. As the title suggests, Test-Drive ASP.NET MVC takes a different approach to learning MVC, as it focuses on testing right from the very start. McCracken wastes no time and swiftly familiarizes us with the framework by building out a trivial Quote-O-Matic application, then dedicates the better part of his book to testing first: first by explaining TDD, and then by coding a full-featured Getting Organized application inspired by David Allen's popular book, Getting Things Done. If you are a learn-by-example kind of coder (like me), you will instantly appreciate and enjoy McCracken's style; it's fast-moving, pragmatic, and focused on only the most relevant information required to get you going with ASP.NET MVC and TDD.

    The book continues with the test-first theme, but McCracken moves away from the sample application and incorporates other practical skills like persisting models with NHibernate, leveraging Inversion of Control with the IControllerFactory, and building a RESTful web service. What I most appreciated about this section was McCracken's use of and praise for open source libraries like Rhino Mocks, SQLite and StructureMap (to name just a few) and productivity tools like ReSharper, Web Platform Installer and the ASP.NET SQL Server Setup Wizard. McCracken's emphasis on real-world, pragmatic development is clearly demonstrated in every tool choice, straightforward code block and developer tip. Whether one is already familiar with the tools and tips or not, McCracken's thought process is easily understood and appreciated.

    The final section of the book walks the reader through security and deployment: everything from error handling and logging with ELMAH, to ASP.NET Health Monitoring, to using MSBuild with automated builds, to the deployment of ASP.NET MVC to various web environments. These chapters, like those prior, offer enough information and explanation to simply help you get the job done.
    Do I believe Test-Drive ASP.NET MVC will turn you into an expert MVC developer overnight? Well, no. I don't think any book can make that claim. If that were possible, I think book list prices would skyrocket! That said, Test-Drive ASP.NET MVC provides a solid foundation and a unique (and dare I say necessary) approach to learning ASP.NET MVC. Along the way, McCracken shares loads of very practical software development tips and references numerous tools and libraries. The bottom line is that it's a great ASP.NET MVC primer; if you're new to ASP.NET MVC, it's just what you need to get started. Do I believe Test-Drive ASP.NET MVC will give you everything you need to start employing TDD in your everyday development? Well, I used to think that learning TDD required a lot of practice and, if you're lucky enough, the guidance of a mentor or coach. I used to think that one couldn't learn TDD from a book alone. Well, I'm still no pro, but I'm testing first now, and Jonathan McCracken and his book, Test-Drive ASP.NET MVC, played a big part in making that happen. If you are an MVC developer and a TDD newb, Test-Drive ASP.NET MVC is just the book for you.
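    For flavor, here is the kind of first "red" test that the book's test-first approach encourages. This is my own minimal sketch, NUnit-style and using a hypothetical QuoteController; it is not code from the book:

        using NUnit.Framework;
        using System.Web.Mvc;

        [TestFixture]
        public class QuoteControllerTests
        {
            [Test]
            public void Index_Returns_The_Default_View()
            {
                // Controllers are plain classes, so no web server is required.
                // This won't even compile until QuoteController exists: that's the "red" step.
                var controller = new QuoteController();

                var result = controller.Index() as ViewResult;

                Assert.IsNotNull(result);
            }
        }

    You then write just enough controller code to make it pass, and refactor from there.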


  • Copying Columns from Grid to Clipboard in SQL Developer

    - by thatjeffsmith
    There are several ways to get data from a query or a table or view to the clipboard. You know the tried and true: copy and paste. But what if you only want one or more columns, not every column? There are several ways to do this; let's see if we can't identify all of them.

    Write your query to only include the data you want. Obvious? Yes. Needed to be said? Definitely. The best tuning tip is to only ask for the data you need, only when you absolutely need it. But let's look at a few more practical ways to do this.

    Hide the unwanted columns. Right-click on a column header and, in the context menu, select 'Columns.' Hide the columns you don't want, then copy and paste. (See also: WYSIWYG Grids, Hide Columns and Filter Rows.)

    Mouse-select the columns. Obvious, but a bit painful. For a very large dataset, you'll be holding down the Shift and PageDown buttons, but it works. Remember to use Ctrl+Shift+C to get the column headers with the data.

    Use the Export Wizard. This used to be called 'Unload'; agreed, not a great name, so we changed it. In a grid, right-click on the data and, on the context menu, select 'Export…'. Select your format; I suggest 'delimited' or 'fixed' for copying data to the clipboard. You can export to the clipboard, yes you can! Click 'Next.' Click in the Columns dialog and choose the columns you want copied, trimming the columns you don't want. Click 'Finish.' Alt- or Ctrl-Tab to your window or application of choice, and paste!

        "FIRST_NAME" "LAST_NAME"
        "Donald" "OConnell"
        "Douglas" "Grant"
        "Jennifer" "Whalen"
        "Pat" "Fay"
        "Susan" "Mavris"
        "William" "Gietz"
        "Alexander" "Hunold"
        "Bruce" "Ernst"
        "David" "Austin"
        "Valli" "Pataballa"
        "Diana" "Lorentz"
        "Daniel" "Faviet"
        "John" "Chen"
        "Ismael" "Sciarra"
        "Jose Manuel" "Urman"
        "Luis" "Popp"
        "Alexander" "Khoo"
        "Shelli" "Baida"
        "Sigal" "Tobias"
        "Guy" "Himuro"
        "Karen" "Colmenares"
        "Matthew" "Weiss"
        "Adam" "Fripp"
        "Payam" "Kaufling"
        "Shanta" "Vollman"
        "Kevin" "Mourgos"
        "Julia" "Nayer"
        "Irene" "Mikkilineni"
        ...

    There are probably at least two or three more ways, but try these and let me know how we can improve things. I've already gotten a request to be able to include the SQL text used to populate the dataset in the copy to clipboard, and it's now on our to-do list.
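    As a small illustration of that first tip, assuming the HR sample schema that produced the output above, the query itself can do the trimming:

        -- Ask for only the two columns you intend to paste,
        -- rather than copying everything and trimming afterwards.
        SELECT first_name, last_name
          FROM hr.employees
         ORDER BY last_name;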


  • Silverlight Firestarter Wrap Up and WCF RIA Services Talk Sample Code

    - by dwahlin
    I had a great time attending and speaking at the Silverlight Firestarter event up in Redmond on December 2, 2010. In addition to getting a chance to hang out with a lot of cool people from Microsoft such as Scott Guthrie, John Papa, Tim Heuer, Brian Goldfarb, John Allwright, David Pugmire, Jesse Liberty, Jeff Handley, Yavor Georgiev, Jossef Goldberg, Mike Cook and many others, I also had a chance to chat with a lot of people attending the event and hear about what projects they're working on, which was awesome. If you didn't get a chance to look through all of the new features coming in Silverlight 5, check out John Papa's post on the subject.

    While at the Silverlight Firestarter event I gave a presentation on WCF RIA Services and wanted to get the code posted, since several people have asked when it'd be available. The talk can be viewed by clicking the image below. Code from the talk follows, as well as additional links. I had a few people ask about the green bracelet on my left hand since it looks like something you'd get from a waterpark. It was used to get us access down a little hall that led backstage and allowed us to go backstage during the event. I thought it looked kind of dorky but it was required to get through security.

    Sample Code from My WCF RIA Services Talk (to log in to the 2 apps use "user" and "P@ssw0rd"; make sure to do a rebuild of the projects in Visual Studio before running them.)
    View All Silverlight Firestarter Talks and Scott Guthrie's Keynote
    WCF RIA Services SP1 Beta for Silverlight 4
    WCF RIA Services Code Samples (including some SP1 samples)
    Improved binding support in EntitySet and EntityCollection with SP1 (Kyle McClellan's Blog)
    Introducing an MVVM-Friendly DomainDataSource: The DomainCollectionView (Kyle McClellan's Blog)

    I've had the chance to speak at a lot of conferences, but never with as many cameras, streaming capabilities, people watching live and overall hype involved. Over 1000 people registered to attend the conference in person at the Microsoft campus and well over 15,000 to watch it through the live stream.

    The event started for me on Tuesday afternoon with a flight up to Seattle from Phoenix. My flight was delayed 1 1/2 hours (I seem to be good at booking delayed flights) so I didn't get up there until almost 8 PM. John Papa did a tech check at 9 PM that night and I was scheduled for 9:30 PM. We basically plugged in my laptop backstage (amazing number of servers, racks and audio devices back there) and made sure everything showed up properly on the projector and the machines recording the presentation. In addition to a dedicated show director, there were at least 5 tech people backstage and at least that many up in the booth running lights, audio, cameras, and other aspects of the show. I wish I would've taken a picture of the backstage setup since it was pretty massive: servers all over the place. I definitely gained a new appreciation for how much work goes into these types of events. Here's what the room looked like right before my tech check, not real exciting at this point. That's Yavor Georgiev (who spoke on WCF Services at the Firestarter) in the background. We had plenty of monitors to reference during the presentation: two monitors for slides (right and left side) and a notes monitor. The 4th monitor showed the time, and they'd type in notes to us as we talked (such as "You're over time!" in my case, since I went around 4 minutes over :-)).
    Wednesday morning I went back to campus at Microsoft and watched John Papa film a few Silverlight TV episodes with Dave Campbell and Ryan Plemons. Next I had the chance to watch the dry run of the keynote with Scott Guthrie and John Papa. We were all blown away by the demos shown, since they were even better than expected. Starting at 1 PM on Wednesday I went over to Building 35 and listened to Yavor Georgiev (WCF Services), Jaime Rodriguez (Windows Phone 7), Jesse Liberty (Data Binding) and Jossef Goldberg and Mike Cook (Silverlight Performance) give their different talks, and we all shared feedback with each other, which was a lot of fun. Jeff Handley from the RIA Services team came afterwards and listened to me give a dry run of my WCF RIA Services talk. He had some great feedback that I really appreciated getting. That night I hung out with John Papa and Ward Bell and listened to John walk through his keynote demos. I also got a sneak peek of the gift given to Dave Campbell for all his work with Silverlight Cream over the years: a poster signed by all of the key people involved with Silverlight.

    Thursday morning I got up fairly early to get to the event center by 8 AM for speaker pictures. It was nice and quiet at that point, although outside the room there was a huge line of people waiting to get in. At around 8:30 AM everyone was let in and the main room was filled quickly. Two other overflow rooms in the Microsoft conference center (Building 33) were also filled to capacity. At around 9 AM Scott Guthrie kicked off the event and all the excitement started! From there it was all a blur, but it was definitely a lot of fun.

    All of the sessions for the Silverlight Firestarter were recorded and can be watched here (including the keynote). Corey Schuman, John Papa and I also released 11 lab exercises and associated videos to help people get started with Silverlight. Definitely check them out if you're interested in learning more!

    Level 100: Getting Started
    Lab 01 - WinForms and Silverlight
    Lab 02 - ASP.NET and Silverlight
    Lab 03 - XAML and Controls
    Lab 04 - Data Binding

    Level 200: Ready for More
    Lab 05 - Migrating Apps to Out-of-Browser
    Lab 06 - Great UX with Blend
    Lab 07 - Web Services and Silverlight
    Lab 08 - Using WCF RIA Services

    Level 300: Take me Further
    Lab 09 - Deep Dive into Out-of-Browser
    Lab 10 - Silverlight Patterns: Using MVVM
    Lab 11 - Silverlight and Windows Phone 7


  • Visualising data a different way with Pivot collections

    - by Rob Farley
    Roger's been doing a great job extending PivotViewer recently, and you can find the list of LobsterPot pivots at http://pivot.lobsterpot.com.au. Many months back, the TED Talk that Gary Flake did about Pivot caught my imagination, and I did some research into it. At the time, most of what we did with Pivot was geared towards what we could do for clients, including making Pivot collections based on students at a school, and using it to browse PDF invoices by their various properties. We had actual commercial work based on Pivot collections back then, and it was all kinds of fun. Later, we made some collections for events that were happening, and even got featured in the TechEd Australia keynote. But I'm getting ahead of myself... let me explain the concept.

    A Pivot collection is an XML file (with a .cxml extension) which lists Items, each linking to an image that's stored in the Deep Zoom format (this means that it contains tiles, like Bing Maps, so that the browser can request only the ones of interest according to the zoom level). This collection can be shown in a Silverlight application that uses the PivotViewer control, or in the Pivot Browser that's available from getpivot.com. Filtering and sorting the items according to their facets (attributes, such as size, age, category, etc.), the PivotViewer rearranges the way that these are shown in a very dynamic way. To quote Gary Flake, this lets us "see patterns which are otherwise hidden".

    This browsing mechanism is very suited to a number of different methods, because it's just that: browsing. It's not searching; it's more akin to window-shopping than doing an internet search. When we decided to put something together for conferences such as TechEd Australia 2010 and the PASS Summit 2010, we did some screen-scraping to provide a different view of data that was already available online. Nick Hodge and Michael Kordahi from Microsoft liked the idea a lot, and after a bit of tweaking, we produced one that Michael used in the TechEd Australia keynote to show the variety of talks on offer.

    It's interesting to see a pattern in this data: the Office track has the most sessions, but if the Interactive Sessions and Instructor-Led Labs are removed, it drops down to only the sixth most popular track, with Cloud Computing taking over. This is something which just isn't obvious when you look at an ordinary search tool. You get a much better feel for the data when moving around it like this.

    The more observant amongst you will have noticed some differences between the collection that Michael is demonstrating in the picture above and the screenshots I've shown. That's because it's been extended some more. At the SQLBits conference in the UK this year, I had some interesting discussions with the guys from Xpert360, particularly Phil Carter, who I'd met in 2009 at an earlier SQLBits conference. They had got around to producing a Pivot collection based on the SQLBits data, which we had been planning to do but ran out of time. We discussed some of the ways that Pivot could be used, including the ways that my old friend Howard Dierking had extended it for MSDN Magazine. I'm not suggesting I influenced Xpert360 at all, but they certainly inspired us with some of their posts on the matter. So with LobsterPot guys David Gardiner and Roger Noble both having dabbled in Pivot collections (and Dave doing some for clients), I set Roger to work on extending it some more.
    He's used various events and so on to build an environment that allows us to do quick deployment of new collections, as well as showing the data in a grid view which behaves as if it were simply a third view of the data (the other two being the array of images and the 'histogram' view).

    I see PivotViewer as a significant step in data visualisation; so much so that I feature it when I deliver talks on spatial data visualisation methods. Any time there is information that can be conveyed through an image, you have to ask yourself how best to show that image, and whether that image is the focal point. For spatial data, the image is most often a map, and the map becomes the central mode for navigation. I show Pivot with postcode areas, since I can browse the postcodes based on their data, and many of the images are recognisable (to locals of South Australia). Naturally, the images could link through to the map itself, and so on, but generally people think of spatial data in terms of navigating a map, which doesn't always gel with the information you're trying to extract. Roger's even looking into ways to hook PivotViewer into the Bing Maps API, in a similar way to the Deep Earth project, displaying different levels of map detail according to how 'zoomed in' the images are.

    Some of the work that Dave did with one of the schools was generating the Deep Zoom tiles "on the fly", based on images stored in a database, and Roger has produced a collection which uses images from flickr that lets you move from one search term to another. Pulling the images down from flickr.com isn't particularly ideal from a performance aspect, and flickr doesn't store images in a small enough format to really lend itself to this use, but you might agree that it's an interesting concept which compares nicely to using maps.

    I'm looking forward to future versions of the PivotViewer control, and hope they provide many more events that can be used, and even more hooks into it. Naturally, LobsterPot could help provide your business with a PivotViewer experience, but you can probably do a lot of it yourself too. There's a thorough guide at getpivot.com, which is how we got into it. For some examples of what we've done, have a look at http://pivot.lobsterpot.com.au. I'd like to see PivotViewer really catch on as a data visualisation tool.
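    For the curious, here is a minimal hand-written sketch of what a .cxml collection looks like: one facet category and one item. The element names follow the public collection schema; the collection name, image references and facet values are made up for illustration:

        <?xml version="1.0" encoding="utf-8"?>
        <Collection Name="Demo Sessions" SchemaVersion="1.0"
                    xmlns="http://schemas.microsoft.com/collection/metadata/2009">
          <FacetCategories>
            <FacetCategory Name="Track" Type="String" />
          </FacetCategories>
          <!-- ImgBase points at the Deep Zoom collection holding the tiled images -->
          <Items ImgBase="sessions.dzc">
            <Item Id="0" Img="#0" Name="Pivot 101">
              <Facets>
                <Facet Name="Track">
                  <String Value="Cloud Computing" />
                </Facet>
              </Facets>
            </Item>
          </Items>
        </Collection>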


  • top Tweets SOA Partner Community – May 2012

    - by JuergenKress
    Send your tweets @soacommunity #soacommunity and follow us at http://twitter.com/soacommunity

    SOA Community: BPMN2.0 Oracle notations poster from eaiesb http://wp.me/p10C8u-pu
    Torsten Winterberg: Look out for new Oracle #BPM edition coming up soon: The Oracle BPM Standard edtion! Great news for easy entry, small licence fees. Yes!
    Danilo Schmiedel: Had a great chat with customer yesterday about #OracleBPM. Next step will be a 5day event combining modeling and implementation @soacommunity
    Frank Nimphius: Still reading "Oracle Business Process Management Suite 11g Handbook". Excellent resource for a non-SOA but ADF guy like me ;-)
    Oracle: New webcast: Maximize #Oracle #WebLogic Server ROI with Oracle #Enterprise #Manager 12c on May 2 at 10 am PT. Register http://bit.ly/JFUrR9
    OTNArchBeat: BPM in Financial Services Industry | Sanjeev Sharma http://bit.ly/HCCxui
    JDeveloper & ADF: BPEL 11.1.1.6 Certified for Prebuilt E-Business Suite 12.1.3 SOA Integrations http://dlvr.it/1V9SxR
    Oracle UPK & Tutor: Collaborate Attendees: Visit the UPK demo pod, SIGS, and sessions: If you are attending Collaborate 2012 - Sun. http://bit.ly/J39z65
    Heidi Buelow: see #fmw track RT @demed: Are you going to #KSCOPE12 in San Antonio, June 24-28? http://kscope12.com/component/seminar/seminarslist?topicsid=6 Use promo code Fusion for discount!
    Sabine Leitner: #SIG #Middleware 15.05. Frankfurt #Oracle #DOAG Planung & Aufbau WebLogic Server #WLS http://bit.ly/HKsCWV @OracleWebLogic @soacommunity
    SOA Community: MDS explorer by Red Samurai http://wp.me/p10C8u-pp
    Biemond ®: Retrieve or set a HTTP header from Oracle BPEL: With Oracle SOA Suite 11g patch 12928372 you can finally retrie http://bit.ly/JejTHC
    Lucas Jellema: Call for papers for UKOUG 2012 has opened: http://techandebs.ukoug.org/default.asp?p=9306 (deadline 1st of June)
    OTNArchBeat: BPM API usage: List all BPM Processes for a user | Kavitha Srinivasan http://bit.ly/IJKVfj
    demed: SOA, Cloud + Service Tech symposium (London, Sep 24-25) call for paper is open http://www.servicetechsymposium.com/call2012.php @techsymp #oraclesoa
    OracleBlogs: Lessons learned configuring OER 11g Workflows http://ow.ly/1iMsKh
    OTNArchBeat: Scripting WebLogic Admin Server Startup | Antony Reynolds http://bit.ly/IH5ciU
    orclateamsoa: A-Team Blog #ateam: BPM API usage: List all BPM Processes for a user http://ow.ly/1iJADp
    Lucas Jellema: Just blogged about our Live FMW Application Development show during OBUG 2012, next Tuesday 24th April in Maastricht:
    OracleBlogs: OEG integration with OSB/OWSM - 11g http://ow.ly/1iKx7G
    SOA Community: SOA Community Newsletter April 2012 http://wp.me/p10C8u-pl
    Frank Dorst: RT @whitehorsesnl: Whiteblog: BPM Process Spaces in Oracle Webcenter (Patch Set 5) (http://bit.ly/Hxzh29) #soacommunity #bpm #oracle
    David Shaffer: The Advanced SOA suite training class next week in Redwood City is full! Learned a lot about accepting credit card payments.
    OTNArchBeat: Running Built-In Test Simulator with SOA Suite Healthcare 11g in PS4 and PS5 | Shub Lahiri http://bit.ly/IgI8GN
    SOA Community: Oracle Fusion Middleware Innovation Awards 2012, Call for Nominations #ofmaward #soa #bpm #soacommunity
    OTNArchBeat: Updated SOA Documents now available in ITSO Reference Library http://bit.ly/I3Y6Sg
    Oracle Middleware: Data Integrator & SOA - why 2 products better than one for integration? Webcast: Apr 24 10 AM PT http://bit.ly/IzmtKR
    Andrejus Baranovskis: Red Samurai MDS Cleaner V2.0 http://fb.me/FxLVz82w
    SOA Community: "@rluttikhuizen: Chapter 4 of SOA Made Simple book "Classification of Services" ready for collegial review" can #soacommunity get a preview?
    Xavier Verhaeghe: #Gartner figures are out: #Oracle top in App Server market share (43.1%) and Relational #Database, too (48.8%) in 2011
    Sabine Leitner: WLS12c, Exa*, IDM, EM12c, DB @ Private, Public, Hybrid #Cloud Event 26.04. FFM #Oracle http://bit.ly/zcRuxi @OracleCloudZone @soacommunity
    Michel Schildmeijer: @wlscommunity @MiddlewareMagic @OTNArchBeat @Oracle_Fusion Oracle WebLogic / SOA Suite 11g HACMP Cluster take-over http://lnkd.in/G78qMd
    Oracle Middleware: Hear how ODI and SOA's unified approach are key to untangling your business. April 24 10AM PT http://bit.ly/IdcsUz #Oracle
    OTNArchBeat: Using SAP Adapter with OSB 11g (PS3) | Shub Lahiri http://bit.ly/IswR9K
    SOA Community: Integrating with Oracle Fusion Applications: Discovering Integration Artifacts https://blogs.oracle.com/governance/entry/integrating_with_oracle_fusion_applications #soacommunity #oer #governance
    OracleBlogs: Tuning B2B Server Engine Threads in SOA Suite 11g http://ow.ly/1iH5bx
    OracleBlogs: Top Tweets SOA Partner Community April 2012 http://ow.ly/1iVHfA
    SOA Community: Oracle SOA Suite 11g Database Growth Management http://wp.me/p10C8u-pi
    Sabine Leitner: WLS12c, Exa*, IDM, EM12c, DB @ Private, Public, Hybrid #Cloud Event 24.04. München #Oracle http://bit.ly/zcRuxi @OracleCloudZone @soacommunity
    SOA Community: Testing Business Rules by Mark Nelson http://redstack.wordpress.com/2012/04/18/testing-business-rules/ #soacommunity #soa #rules #oracle
    SOA Community: Top Tweets SOA Partner Community - April 2012 http://wp.me/p10C8u-pn
    OTNArchBeat: Webcast: Untangle Your Business with Oracle Unified SOA and Data Integration - April 24 http://bit.ly/IQexqT
    OTNArchBeat: "Do more with SOA Integration: Best of Packt" contributors include @gschmutz, @llaszews, many others http://amzn.to/HVWwYt
    ServiceTechSymposium: Symposium agenda page coming together - page launched today with keynotes, sessions to be added shortly. http://www.servicetechsymposium.com/agenda2012.php
    SOA Community: Shipping Specialization plaques - congratulation #Fujitsu - request yours https://soacommunity.wordpress.com/2011/02/23/who-are-the-soa-experts-specialization-recognized-by-customers/ #soacommunity #OPN http://pic.twitter.com/YMRm2ion
    ServiceTechSymposium: Call for Presentations Submission Deadline Moved Up to May 21, 2012. Send your presentations submissions ASAP!
    ServiceTechSymposium: Symposium Keynote by Vicente Navarro, European Space Agency, added to agenda: "SOA & Service-Orientation at the European Space Agency"
    SOA Community: Running a large #soa project? Make sure you read - Oracle SOA Suite 11g Database Growth Management #soacommunity #opn
    SOA Community: List all BPM Processes for a user by Yogesh l #bpm #oracle #soacommunity

    For regular information on Oracle SOA Suite become a member in the SOA Partner Community. For registration please visit www.oracle.com/goto/emea/soa (OPN account required).


  • Dynamic Data Connections

    - by Tim Dexter
    I have had a long-running email thread going between Dan and David over at Valspar and myself. They have built some impressive connectivity between their in-house apps and BIP using web services. The crux of their problem has been that they have multiple databases that need the same report executed against them. Not such an unusual request, as I have spoken to two customers in the last month with the same situation. Of course, you could create a report against each data connection and just run or call the appropriate report. Not too bad if you have two or three data connections, but more than that and it becomes a maintenance nightmare, having to update queries or layouts. Ideally you want to have just a single report definition on the BIP server and to dynamically set the connection to be used at runtime, based on the user or the system that the user is in.

    A quick bit of digging and help from Shinji on the development team, and I had an answer. Rather embarrassingly, the solution has been around since the October 2010 rollup patch last year. Still, I grabbed the latest January 2011 patch; check out Note 797057.1 for the latest available patches. Once installed, I used the best web service testing tool I have yet to come across, SoapUI. Just point it at the WSDL and you can check out the available services and their parameters, and then test them too.

    The XML packet has a new dynamic data source entry. You can set your own custom JDBC connection or just specify an existing data source name that's defined on the server:

    <pub:runReport>
       <pub:reportRequest>
          <pub:attributeFormat>xml</pub:attributeFormat>
          <pub:attributeTemplate>0</pub:attributeTemplate>
          <pub:byPassCache>true</pub:byPassCache>
          <pub:dynamicDataSource>
             <pub:JDBCDataSource>
                <pub:JDBCDriverClass></pub:JDBCDriverClass>
                <pub:JDBCDriverType></pub:JDBCDriverType>
                <pub:JDBCPassword></pub:JDBCPassword>
                <pub:JDBCURL></pub:JDBCURL>
                <pub:JDBCUserName></pub:JDBCUserName>
                <pub:dataSourceName>Conn1</pub:dataSourceName>
             </pub:JDBCDataSource>
          </pub:dynamicDataSource>
          <pub:reportAbsolutePath>/Test/Employee Report/Employee Report.xdo</pub:reportAbsolutePath>
       </pub:reportRequest>
       <pub:userID>Administrator</pub:userID>
       <pub:password>Administrator</pub:password>
    </pub:runReport>

    So I have Conn1 and Conn2 defined, which are connections to different databases. I can just flip the name, make the WS call and get the appropriate dataset in my report. Just as an example, here's my web service call Java code; it's just a case of bringing the BIP Java libs into my Java project:

    // Build the request and point it at the named server-side data source.
    PublicReportServiceService publicReportServiceService = new PublicReportServiceService();
    PublicReportService publicReportService = publicReportServiceService.getPublicReportService_v11();
    String userID = "Administrator";
    String password = "Administrator";
    ReportRequest rr = new ReportRequest();
    rr.setAttributeFormat("xml");
    rr.setAttributeTemplate("1");
    rr.setByPassCache(true);
    rr.setReportAbsolutePath("/Test/Employee Report/Employee Report.xdo");
    rr.setReportOutputPath("c:\\temp\\output.xml");
    BIPDataSource bipds = new BIPDataSource();
    JDBCDataSource jds = new JDBCDataSource();
    jds.setDataSourceName("Conn1");
    bipds.setJDBCDataSource(jds);
    rr.setDynamicDataSource(bipds);
    try {
        publicReportService.runReport(rr, userID, password);
    } catch (InvalidParametersException e) {
        e.printStackTrace();
    } catch (AccessDeniedException e) {
        e.printStackTrace();
    } catch (OperationFailedException e) {
        e.printStackTrace();
    }

    Note: I'm no Java whiz kid or whizzy old bloke, at least not unless I've had a coffee.
    JDeveloper has a nice feature where you point it at the WSDL and it creates everything needed to support your calling code. A couple of things to remember:

    1. When you call the service, remember to set the bypass-cache option. Forget it, and much scratching of your head and taking of my name in vain will ensue.

    2. My demo actually hit the same database but used two users: one accessed the base tables, the other accessed views with the same names. For far too long I thought the connection swapping was not working; I was getting the same results for both users, until I realized I was specifying the schema name for the table/view in my query, e.g. select * from EMP.EMPLOYEES. So remember to use a generic query that will depend entirely on the connection.

    It's a neat feature if you want to be able to switch connections and only define a single report and call it remotely. Now, if you want the connection to be set dynamically based on the user when the report is run via the user interface, that's going to be more tricky ... need to think about that one!
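    For completeness, here is a hedged sketch of the other variant: a fully custom JDBC connection rather than a named server data source. The element names come straight from the packet above, but the driver class, type, URL and credentials shown are placeholder values you would swap for your own (the exact driver type string may vary by BIP version):

    <pub:dynamicDataSource>
       <pub:JDBCDataSource>
          <pub:JDBCDriverClass>oracle.jdbc.OracleDriver</pub:JDBCDriverClass>
          <pub:JDBCDriverType>oracle</pub:JDBCDriverType>
          <pub:JDBCPassword>tiger</pub:JDBCPassword>
          <pub:JDBCURL>jdbc:oracle:thin:@dbhost:1521:orcl</pub:JDBCURL>
          <pub:JDBCUserName>scott</pub:JDBCUserName>
          <pub:dataSourceName></pub:dataSourceName>
       </pub:JDBCDataSource>
    </pub:dynamicDataSource>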


  • Personal search – the future of search

    - by jamiet
    [Four months ago I wrote a meandering blog post on another blogging site entitled Personal search – the future of search. The points I made therein are becoming more relevant to what I'm reading about and hoping to get involved in in the future, so I'm re-posting here to a wider audience to hopefully get some more feedback and gauge reaction to it. This has been prompted by the book Pull by David Siegel, which is forming my current holiday reading (recommended to me by a commenter on my previous post Interesting things – Twitter annotations and your phone as a web server), and in particular by Siegel's notion of us all in the future having a personal online data vault.]

    My one-time colleague Paul Dawson recently wrote an article called The Future of Search, and in it he proposed some interesting ideas. Some choice quotes:

    The growth of Chinese search giant Baidu is an indicator that fully localised and tailored content and offerings have great traction with local audiences

    This trend is already driving an increase in the use of specialist searches … Look at how Farecast is now integrated into Bing for example, or how Flightstats is now integrated into Google.

    Search does not necessarily have to begin with a keyword, but could start instead with a click or a touch. Take a look at Retrievr. Start drawing a picture in the box and see what happens. This is certainly search without the need for typing in keywords

    search technology has advanced greatly in recent years. The recent launch of Microsoft Live Labs' Pivot has given us a taste of what we can expect to see in the future

    This really got me thinking about where search might go in the future, and as my mind wandered I realised that as the amount of data that we collect about ourselves increases, so too will the need and the desire to search it. The amount of electronic data that exists about each and every person is increasing, and in the near future I fully expect that we are going to be able to store personal data such as:

    A history of our location (in fact Google Latitude already offers this facility)
    Recordings of all our phone conversations
    Health information history (weight, blood pressure etc…)
    Energy usage
    Spending history
    What films we watch, what radio stations we listen to
    Voting history

    Of course, most of this stuff is already stored somewhere, but crucially we don't have easy access to it. My utilities supplier knows how much electricity I'm using, but if I want to know for myself I have to go and dig through my statements (assuming I have kept them). Similarly, my doctor probably has ready access to all of my health records, my bank knows exactly what I have spent my money on, my cable supplier knows what I watch on TV, and my mobile phone supplier probably knows exactly where I am and where I've been for the past few years. Strange, then, that none of this electronic information is available to me in a way that I can really make use of it; after all, it's MY information. It's MY data. I created it.

    That is set to change. As technologies mature and customers become more technically cognizant, they will demand more access to the data that companies hold about them. The companies themselves will realise the benefit that they derive from giving users what they want, and will embrace ways of providing it.
    As a result, the amount of data that we store about ourselves is going to increase exponentially, and the desire to search and derive value from that data is going to grow with it; we are about to enter the era of the "personal datastore", and we will want, and need, to search through it in order to make sense of it all. It's interesting, then, that today when we think of search we think of search engines, and yet in these personal datastores we're referring to data that search engines can't touch, because WE own it and we (hopefully) choose to keep it private. Someone, I know not who, is going to lead in this space by making it easy for us to search our data and retrieve information that we have either forgotten or maybe didn't even know in the first place. We will learn new things about ourselves and about our habits; we will share these findings with whomever we choose; we will compare what we discover with others; we will collaborate for mutual benefit; and, most of all, we will educate ourselves as to how to live our lives better. Search will be the means to that end; it will enable us to make sense of the wealth of information that we will collect day in, day out. The future of search is personal. Why would we be interested in anything else?

    @Jamiet


  • BIP and Mapviewer Mash Up I

    - by Tim Dexter
    I was out in Yellowstone last week soaking up various wildlife and a bit too much rain ... good to be back, until the 95F heat yesterday. Taking a little break from the Excel templates; the dev folks are planning an Excel patch in the next week or so that will add a mass of new functionality. At the risk of completely misleading you, I'm going to hang back a while. What I have written so far holds true and will continue to do so.

    This week, I have been mostly eating 'mapviewer' ... answers on a postcard please, TV show and character. I had a request to show how BIP can call mapviewer and render a dynamic map in an output. So I hit the books and colleagues for some answers. Mapviewer is Oracle's geographic information system, hereafter known as GIS. I use it a lot in our BIEE demos, where the interaction with the maps is very impressive. Need a map of California and its congressional districts? I have contacts: Jerry and David with their little black box of maps. Once in my possession, I can build highly interactive, clickable maps that allow the user to drill into more information, using a very friendly interface driving BIEE content and navigation. But what about maps in BIP output?

    Bryan Wise, who has written some articles on this blog, did some work a while back with the PL/SQL API interface. The extract for the report called a function that in turn called the mapviewer server, passing a set of mapping requirements; it then returned a URL to a cached copy of that map. It was then easy to have BIP render that image. That's still very doable. You need to install a couple of packages and then load the mapviewer Java APIs into the database. Then you can write your function against the APIs. A little involved? Maybe, but the database is doing all the heavy lifting for you.

    I thought I would investigate another method for getting the maps back into BIP. There is a URL interface you can call; this involves building an XML message to be passed to the mapviewer server. It's pretty straightforward to use on the mapviewer side. On the BIP side, things are a little more tricksy. After some unexpected messing about, I finally got the ubiquitous Hello World map to render using the URL method. Not the most exciting map in the world: lots of ocean and a rather long URL to get it to render:

    http://127.0.0.1:9704/mapviewer/omserver?xml_request=%3Cmap_request%20title=%22Hello%20World%22%20datasource=%22cagis%22%20format=%22GIF_STREAM%22/%3E

    Notice all of the encoding in the URL string to handle the spaces, quotes, etc. All of it is necessary to get BIP to make the call to the mapviewer server correctly without truncating the URL if it hits a real space rather than a %20. With that in mind, constructing the URL was pretty simple. I'm not going to get into the content of the URL too much; for that you need to bone up on the mapviewer XML API. Check out the home page here and the documentation here.

    To make the template portable, I used the standard CURRENT_SERVER_URL parameter from the BIP server and declared that in my template:

    <?param@begin:CURRENT_SERVER_URL;'myserver'?>

    Ignore the 'myserver'; that was just a dummy value for testing. At runtime it will resolve to 'http://yourserver:port/xmlpserver'. Not quite what we need, as mapviewer has its own server path; in my case I needed 'mapviewer/omserver?xml_request=' as the fixed path to the mapviewer request URL.
    A little concatenation and substringing later, I came up with:

    <?param@begin:mURL;concat(substring($CURRENT_SERVER_URL,1,22),'mapviewer/omserver?xml_request=')?>

    That's the basic URL that I can then build on. To get the Hello World map, I need to add the following:

    <map_request title="Hello World" datasource="cagis" format="GIF_STREAM"/>

    Those angle brackets were the source of my headache: BIP's XSLT engine was attempting to process them rather than just pass them through. Hok Min to the rescue ... again. I owe him lunch when I get out to HQ again! To solve the problem, I needed to escape all the characters and white space and then use native XSL to assign the string to a parameter:

    <xsl:param xdofo:ctx="begin" name="pXML">%3Cmap_request%20title=%22Hello%20World%22%20datasource=%22cagis%22%20format=%22GIF_STREAM%22/%3E</xsl:param>

    I did not strictly need to assign it to a parameter, but I felt that if I were going to do anything more serious than Hello World, like plotting points of interest on the map, I would need to build the URL dynamically, so using a set of parameters or variables that I then concatenate would be easier. Now that I had the initial server string and the request, all I then did was combine the two using a concat:

    concat($mURL,$pXML)

    Embedding that into an image tag:

    <fo:external-graphic src="url({concat($mURL,$pXML)})"/>

    and I was done. Notice the curly braces to get the concat evaluated prior to the image call. As you will see next time, building the XML message to go onto the URL can get quite complex, but I have used it with some data. Ultimately, it would be easier to build an extension to BIP to handle the data to be plotted; it would then build the XML message, call mapviewer and return a URL to the map image for BIP to render. More on that next time ...
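    Putting the pieces together, the complete template plumbing is only four lines; these are exactly the fragments shown above, just in order:

    <?param@begin:CURRENT_SERVER_URL;'myserver'?>
    <?param@begin:mURL;concat(substring($CURRENT_SERVER_URL,1,22),'mapviewer/omserver?xml_request=')?>
    <xsl:param xdofo:ctx="begin" name="pXML">%3Cmap_request%20title=%22Hello%20World%22%20datasource=%22cagis%22%20format=%22GIF_STREAM%22/%3E</xsl:param>
    <fo:external-graphic src="url({concat($mURL,$pXML)})"/>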


  • IE9 not rendering box-shadow Elements inside of Table Cells

    - by Rick Strahl
    Ran into an annoying problem today with IE 9. I've been slowly updating some older sites with CSS 3 tags, and for the most part IE9 does a reasonably decent job of working with the new CSS 3 features. Not all of them by a long shot, but at least some of the more useful ones, like border-radius and box-shadow, are supported. Until today I was happy to see that IE supported box-shadow just fine, but I ran into a problem with some old markup that uses tables for its main layout sections. I found that inside of a table cell IE fails to render a box-shadow. Below are images from Chrome (left) and IE 9 (right) of the same content.

    The download and purchase images are rendered with:

    <a href="download.asp" style="display:block;margin: 10px;"><img src="../images/download.gif" class="boxshadow roundbox" /></a>

    where the .boxshadow and .roundbox styles look like this:

    .boxshadow {
        -moz-box-shadow: 3px 3px 5px #535353;
        -webkit-box-shadow: 3px 3px 5px #535353;
        box-shadow: 3px 3px 5px #535353;
    }
    .roundbox {
        -moz-border-radius: 6px 6px 6px 6px;
        -webkit-border-radius: 6px;
        border-radius: 6px 6px 6px 6px;
    }

    And the Problem is… Collapsed Table Borders

    Now normally these two styles work just fine in IE 9 when applied to elements. But the box-shadow doesn't work inside of this markup, because the parent container is a table cell:

    <td class="sidebar" style="border-collapse: collapse">
    …
    <a href="download.asp" style="display:block;margin: 10px;"><img src="../images/download.gif" class="boxshadow roundbox" /></a>
    …
    </td>

    This HTML causes the image to not show a shadow. In actuality I'm not styling inline; as part of my browser reset I have the following in my master .css file:

    table {
        border-collapse: collapse;
        border-spacing: 0;
    }

    which has the same effect as the inline style. border-collapse by default inherits from the parent, and so the TD inherits from table and tr; TD tags are effectively collapsed. You can check out a test document that demonstrates this behavior here in this CodePaste.net snippet, or run it here.

    How to Work around this Issue

    To get IE9 to render the shadows inside of the TD tag correctly, I can just change the style explicitly NOT to use border-collapse:

    <td class="sidebar" style="border-collapse: separate; border-width: 0;">

    Or better yet (thanks to David's comment below), you can add border-collapse: separate to the .boxshadow style like this:

    .boxshadow {
        -moz-box-shadow: 3px 3px 5px #535353;
        -webkit-box-shadow: 3px 3px 5px #535353;
        box-shadow: 3px 3px 5px #535353;
        border-collapse: separate;
    }

    With either of these approaches, IE renders the shadows correctly.

    Do you really need border-collapse? I think so! Collapsed borders render flat as a single fat line if a border-width and border-color are assigned, while separated borders render a thin line with a bunch of weird white space around it, or worse, render an old-skool 3D raised border, which is terribly ugly as well. So as a matter of course, in any app my browser reset includes the above code to make sure all tables with borders render the same flat borders. As you probably know, IE has all sorts of rendering issues in tables and on backgrounds (opacity backgrounds or image backgrounds), most of which are caused by the way that IE internally uses ActiveX filters to apply these effects. Apparently collapsed borders are yet one more item that causes problems with rendering. There you have it. Another crappy failure in IE we have to check for now; just one more reason to hate Internet Explorer.
Luckily this one has a reasonably easy workaround. I hope this helps out somebody and saves them the hour I spent trying to figure out what caused this problem in the first place.

Resources: Sample HTML document that demonstrates the behavior | Run the Sample

© Rick Strahl, West Wind Technologies, 2005-2012. Posted in HTML, Internet Explorer.
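If you'd rather not hunt down every table cell in a pile of old markup, the same fix can also be applied from script at load time. This is only a minimal sketch, assuming jQuery is on the page and the shadowed elements all carry the .boxshadow class used above:

    // Minimal sketch: force any table cell that contains a .boxshadow element
    // out of border-collapse so IE 9 will render the shadow.
    $(function () {
        $(".boxshadow").closest("td").css({
            "border-collapse": "separate",
            "border-width": "0"
        });
    });

The pure-CSS fix above is still the cleaner option; a script patch like this is only handy when the stylesheet itself can't be touched.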

    Read the article

  • Real-world SignalR example, ditching ghetto long polling

    - by Jeff
One of the highlights of BUILD last week was the announcement that SignalR, a framework for real-time client to server (or cloud, if you will) communication, would be a real supported thing now with the weight of Microsoft behind it. Love the open source flava! If you aren't familiar with SignalR, watch this BUILD session with PM Damian Edwards and dev David Fowler. Go ahead, I'll wait. You'll be in a happy place within the first ten minutes. If you skip to the end, you'll see that they plan to ship this as a real first version by the end of the year. Insert slow clap here.

Writing a few lines of code to move a box around from one browser to the next is a way cool demo, but how about something real-world? When learning new things, I find it difficult to be abstract, and I like real stuff. So I thought about what was in my tool box and then decided to port my crappy long-polling "there are new posts" feature of POP Forums to use SignalR.

A few versions back, I added a feature where a button would light up while you were pecking out a reply if someone else made a post in the interim. It kind of saves you from that awkward moment where someone else posts some snark before you. While I was proud of the feature, I hated the implementation. When you clicked the reply button, it started polling an MVC URL asking if the last post you had matched the last one on the server, and it did it every second and a half until you either replied or the server told you there was a new post, at which point it would display that button. The code was not glam:

    // in the reply setup
    PopForums.replyInterval = setInterval("PopForums.pollForNewPosts(" + topicID + ")", 1500);

    // called from the reply setup and the handler that fetches more posts
    PopForums.pollForNewPosts = function (topicID) {
        $.ajax({
            url: PopForums.areaPath + "/Forum/IsLastPostInTopic/" + topicID,
            type: "GET",
            dataType: "text",
            data: "lastPostID=" + PopForums.currentTopicState.lastVisiblePost,
            success: function (result) {
                var lastPostLoaded = result.toLowerCase() == "true";
                if (lastPostLoaded) {
                    $("#MorePostsBeforeReplyButton").css("visibility", "hidden");
                } else {
                    $("#MorePostsBeforeReplyButton").css("visibility", "visible");
                    clearInterval(PopForums.replyInterval);
                }
            },
            error: function () { }
        });
    };

What's going on here is the creation of an interval timer to keep calling the server and bugging it about new posts, and setting the visibility of a button appropriately. It looks like this if you're monitoring requests in Firebug: Gross.

The SignalR approach was to call a message broker when a reply was made, and have that broker call back to the listening clients, via a SignalR hub, to let them know about the new post. It seemed weird at first, but the server-side hub's only method is to add the caller to a group, so new post notifications only go to callers viewing the topic where a new post was made. Beyond that, it's important to remember that the hub is also the means to calling methods at the client end.

Starting at the server side, here's the hub:

    using Microsoft.AspNet.SignalR.Hubs;

    namespace PopForums.Messaging
    {
        public class Topics : Hub
        {
            public void ListenTo(int topicID)
            {
                Groups.Add(Context.ConnectionId, topicID.ToString());
            }
        }
    }

Have I mentioned how awesomely not complicated this is? The hub acts as the channel between the server and the client, and you'll see how JavaScript calls the above method in a moment.
Next, the broker class and its associated interface:

    using Microsoft.AspNet.SignalR;
    using Topic = PopForums.Models.Topic;

    namespace PopForums.Messaging
    {
        public interface IBroker
        {
            void NotifyNewPosts(Topic topic, int lastPostID);
        }

        public class Broker : IBroker
        {
            public void NotifyNewPosts(Topic topic, int lastPostID)
            {
                var context = GlobalHost.ConnectionManager.GetHubContext<Topics>();
                context.Clients.Group(topic.TopicID.ToString()).notifyNewPosts(lastPostID);
            }
        }
    }

The NotifyNewPosts method uses the static GlobalHost.ConnectionManager.GetHubContext<Topics>() method to get a reference to the hub, and then makes a call to clients in the group matched by the topic ID. It's calling the notifyNewPosts method on the client. The TopicService class, which handles the reply data from the MVC controller, has an instance of the broker new'd up by dependency injection, so it took literally one line of code in the reply action method to get things moving:

    _broker.NotifyNewPosts(topic, post.PostID);

The JavaScript side of things wasn't much harder. When you click the reply button (or quote button), the reply window opens up and fires up a connection to the hub:

    var hub = $.connection.topics;
    hub.client.notifyNewPosts = function (lastPostID) {
        PopForums.setReplyMorePosts(lastPostID);
    };
    $.connection.hub.start().done(function () {
        hub.server.listenTo(topicID);
    });

The important part to look at here is the creation of the notifyNewPosts function. That's the method that is called from the server in the Broker class above. Conversely, once the connection is done, the script calls the listenTo method on the server, letting it know that this particular connection is listening for new posts on this specific topic ID. This whole experiment enables a lot of ideas that would make the forum more Facebook-like, letting you know when stuff is going on around you.
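One thing the excerpt doesn't cover is what happens when the connection dies: a restarted connection gets a new connection ID, which isn't in any hub group yet, so the client would silently stop receiving notifications. A minimal sketch of a guard for that, assuming the same hub and topicID variables as above and the stock SignalR jQuery client (the retry delay is an arbitrary choice, not from the original code):

    // Sketch: after the connection is lost, start fresh and re-join the
    // topic group, since the new connection ID belongs to no group yet.
    $.connection.hub.disconnected(function () {
        setTimeout(function () {
            $.connection.hub.start().done(function () {
                hub.server.listenTo(topicID);
            });
        }, 5000); // modest delay before retrying
    });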

    Read the article

  • What Keeps You from Changing Your Public IP Address and Wreaking Havoc on the Internet?

    - by Jason Fitzpatrick
What exactly is preventing you (or anyone else) from changing their IP address and causing all sorts of headaches for ISPs and other Internet users? Today's Question & Answer session comes to us courtesy of SuperUser - a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.

The Question

SuperUser reader Whitemage is curious about what's preventing him from wantonly changing his IP address and causing trouble: An interesting question was asked of me and I did not know what to answer. So I'll ask here. Let's say I subscribed to an ISP and I'm using cable internet access. The ISP gives me a public IP address of 60.61.62.63. What keeps me from changing this IP address to, let's say, 60.61.62.75, and messing with another consumer's internet access? For the sake of this argument, let's say that this other IP address is also owned by the same ISP. Also, let's assume that it's possible for me to go into the cable modem settings and manually change the IP address. Under a business contract where you are allocated static addresses, you are also assigned a default gateway, a network address and a broadcast address. So that's 3 addresses the ISP "loses" to you. That seems very wasteful for dynamically assigned IP addresses, which the majority of customers are. Could they simply be using static ARPs? ACLs? Other simple mechanisms? Two things to investigate here: why can't we just go around changing our addresses, and is the assignment process as wasteful as it seems?

The Answer

SuperUser contributor Moses offers some insight: Cable modems aren't like your home router (i.e., they don't have a web interface with simple point-and-click buttons that any kid can "hack" into). Cable modems are "looked up" and located by their MAC address by the ISP, and are typically accessed by technicians using proprietary software that only they have access to, that only runs on their servers, and therefore can't really be stolen. Cable modems also authenticate and cross-check settings with the ISP's servers. The server has to tell the modem whether its settings (and location on the cable network) are valid, and simply sets them to what the ISP has configured (bandwidth, DHCP allocations, etc.). For instance, when you tell your ISP "I would like a static IP, please," they allocate one to the modem through their servers, and the modem allows you to use that IP. Same with bandwidth changes, for instance. To do what you are suggesting, you would likely have to break into the servers at the ISP and change what it has set up for your modem.

Could they simply be using static ARPs? ACLs? Other simple mechanisms? Every ISP is different, both in practice and in how close they are with the larger network that is providing service to them. Depending on those factors, they could be using a combination of ACLs and static ARP. It also depends on the technology in the cable network itself. The ISP I worked for used some form of ACL, but that knowledge was a little beyond my pay grade. I only got to work with the technician's interface and do routine maintenance and service changes.

What keeps me from changing this IP address to, let's say, 60.61.62.75 and messing with another consumer's internet access? Given the above, what keeps you from changing your IP to one that your ISP hasn't specifically given to you is a server that is instructing your modem what it can and can't do.
Even if you somehow broke into the modem, if 60.61.62.75 is already allocated to another customer, then the server will simply tell your modem that it can't have it.

David Schwartz offers some additional insight, with a link to a white paper for the really curious: Most modern ISPs (last 13 years or so) will not accept traffic from a customer connection with a source IP address they would not route to that customer were it the destination IP address. This is called "reverse path forwarding". See BCP 38.

Have something to add to the explanation? Sound off in the comments. Want to read more answers from other tech-savvy Stack Exchange users? Check out the full discussion thread here.

    Read the article

  • RightNow CX @ OpenWorld: What to Experience

    - by Tony Berk
We want to welcome our RightNow CX customers to Oracle OpenWorld next week. Get ready for a great week and a whole new experience! For a high-level overview of what is going on during the week, please review these previous posts: Is There a Cloud Over OpenWorld? and What to "CRM" in San Francisco? CRM Highlights for OpenWorld '12. Also, don't forget you can add on the Customer Experience Summit @ OpenWorld to make your week even more complete and get involved with the Experience Revolution! Below is a highlight of only some of the RightNow-related sessions at OpenWorld. Please use OpenWorld Schedule Builder or check the OpenWorld Content Catalog for all of the session details and any time or location changes. Tip: Pre-enrolled session registrants via Schedule Builder are allowed into the session rooms before anyone else, so Schedule Builder will guarantee you a seat. Many of the sessions below will likely be at capacity.

No better way to start off than hearing where Oracle RightNow is going!

Oracle RightNow CX Cloud Service Vision and Roadmap (CON9764) - Oct 1, 10:45 AM. Oracle RightNow CX Cloud Service combines Web, social, and contact center experiences for a unified, cross-channel service solution in the cloud, enabling organizations to increase sales and adoption, build trust, strengthen relationships, and reduce costs and effort. Come to this session to hear from David Vap and his team of Oracle experts about where the product is going and how Oracle is committed to accelerating the pace of innovation and value to its customers.

Interested in the Cloud and want to know why some leading CIOs are moving to the cloud? You can hear firsthand from CIOs from Emerson, Intuit and Overstock.com: CIOs and Governance in the Cloud (CON9767) - Oct 3, 11:45 AM.

And of course there are a number of sessions that drill down into more specific areas. Here are just a few:

Deliver Outstanding Customer Experiences: Oracle RightNow Dynamic Agent Desktop Cloud Service (CON9771) - Oct 1, 4:45 PM. This session covers how companies have delivered exceptional customer experiences and how the Oracle RightNow Dynamic Agent Desktop Cloud Service roadmap will evolve in the future. The Oracle RightNow Contact Center Experience suite includes incident management, knowledge, guided processes, and other service capabilities to unify the customer experience across channels. Come learn about the powerful tools that enable even your junior agents to consistently provide outstanding service across all customer interaction channels.

Self-Service in the Age of Data Intimacy (CON11516) - Oct 1, 3:15 PM. Even though businesses are generating more and more data around their relationships and interactions with customers, very little of the information a business generates ends up available to the contact center, and even less is made available to the online service experience. The generic one-size-fits-all approach that typifies most online service experiences ultimately fails to address all user needs, and that failure ultimately leads to the continued use of high-cost agent-assisted channels for low-value interactions. This session introduces Oracle RightNow Web Experience's Virtual Assistant and discusses how you can deliver rich, engaging, highly personalized experiences with the quality of agent-assisted service at a much lower cost.

Improve Chat Experiences: Best Practices for Chat Pilots and Deployments (CON11517) - Oct 1, 4:45 PM.
Today's organizations are challenged to grow revenue and retain customers with fewer resources, and many have turned to chat as an approach to improving the customer experience, increasing sales conversions, and reducing costs at the same time. From setting goals and metrics and training staff to customizing and tuning the solution, this session provides best practices and lessons learned from a broad set of implementations to help you get the most out of your chat solution.

Differentiated Experience with Web Service (CON9770) - Oct 2, 1:15 PM. A reputation for excellent customer service can differentiate your brand and drive revenue. In this session, learn how to develop that reputation by transforming your online self-service into a highly interactive, branded customer experience. See live examples of how Oracle RightNow Web Experience has helped customers deliver on their Web service strategies.

Unifying the Agent's Engagement Console (CON11518) - Oct 2, 1:15 PM. Does your customer experience suffer because your agents are toggling between multiple tools? Do your agent productivity and morale suffer as well? Come to this session to learn how Oracle RightNow CX Cloud Service seamlessly unifies these disparate systems into a single engagement console. Regardless of channel, powerful adaptive tools consistently guide agents across contextually aware personalized workflows. Great agent experiences drive great customer experiences.

Oracle RightNow CX Cloud Service and the Oracle Customer Experience Portfolio (CON9775) - Oct 3, 10:15 AM. This session covers how Oracle's integrated suite of customer experience (CX) products fits with the Oracle CX portfolio of products (Oracle Fusion Customer Relationship Management; the Oracle ATG, Oracle Endeca, and Oracle Knowledge product families; and Oracle Business Intelligence) to increase revenues, strengthen customer relationships, and reduce costs across the entire end-to-end customer lifecycle for companies that sell to consumers and those that sell to businesses.

Greater Insights from Customer Engagements (CON9773) - Oct 4, 12:45 PM. In this session, hear how to leverage service interaction insights, customer feedback, and segmented service engagements to improve the customer experience. Discover how customers, such as J&P Cycles, learn and take action based on business insights gained through their customer engagements.

Again, these are just some of the sessions, so check out the Content Catalog for details on Knowledge Management, Customization, Integration and more in the Oracle Develop stream for Customer Experience. Be sure to visit the Oracle DEMOgrounds in the Moscone West Exhibit Hall. If this is your first OpenWorld, welcome! If you are returning, hi again and enjoy!

    Read the article

  • Highlights from the Oracle Customer Experience Summit @ OpenWorld

    - by Richard Lefebvre
The Oracle Customer Experience Summit was the first-ever event covering the full breadth of Oracle's CX portfolio - Marketing, Sales, Commerce, and Service. The purpose of the Summit was to articulate the customer experience imperative and to showcase the suite of Oracle products that can help our customers create the best possible customer experience. This topic has always been a very important one, but now that there are so many alternative companies to do business with, and because people have such public ways to voice their displeasure, it's necessary for vendors to have multiple listening posts in place to gauge consumer sentiment. They need to know what is going on in real time and be able to react quickly to turn negative situations into positive ones. Those can then be shared in a social manner to enhance the brand and turn the customer into a repeat customer. The Summit was focused on Oracle's portfolio of products and entirely dedicated to customers who are committed to building great customer experiences within their businesses. Rather than DBAs, the attendees were business people looking to collaborate with other like-minded experts and find out how Oracle can help in terms of technology, best practices, and expertise. The event was at the Westin St. Francis Hotel in San Francisco as part of Oracle OpenWorld. We had eight hundred people attend, which was great for the first year. Next year, there's no doubt in my mind, we can raise that number to 5,000.

Alignment and Logic

Oracle's Customer Experience portfolio is made up of a combination of acquired and organic products owned by many people who are new to Oracle. We include homegrown Fusion CRM, as well as RightNow, Inquira, OPA, Vitrue, ATG, Endeca, and many others. The attendees knew of the acquisitions, so naturally they wanted to see how the products all fit together and hear the logic behind the portfolio. To tell them about our alignment, we needed to be aligned. To accomplish that, a cross-functional team at Oracle agreed on the messaging so that every single Oracle presenter could cover the big picture before going deep into a product or topic. Talking about the full suite of products in one session produced overflow value for other products. And even though this internal coordination was a huge effort, everyone saw the value for our customers and for our long-term cooperation and success.

Keynotes, Workshops, and Tents of Innovation

We scored by having Seth Godin as our keynote speaker - always provocative and popular. The opening keynote was a session orchestrated by Mark Hurd, Anthony Lye, and me. Mark set the stage by giving real-world examples of bad customer experiences, Anthony clearly articulated the business imperative for addressing these experiences, and I brought it all to life by taking the audience around the Customer Lifecycle and showing demos and videos, with partners included at each of the stops around the lifecycle. Brian Curran, a VP for RightNow Product Strategy, presented a session that was in high demand called The Economics of Customer Experience. People loved hearing how to build a business case and justify the cost of building a better customer experience. John Kembel, another VP for RightNow Product Strategy, held a workshop that customers raved about. It was based on the journey mapping methodology he created, which is a way to talk to customers about where they want to make improvements to their customers' experiences. He divided the audience into groups led by facilitators.
Each person had the opportunity to engage with experts and peers and construct some real takeaways. The conference hotel was across from Union Square, so we used that space to set up Innovation Tents. During the day we served lunch in the tents and partners showed their different innovative ideas. It was very interesting to see all the technologies and advancements. It also gave people a place to mix and mingle and to think about the fringe of where we could all take these ideas.

Product Portfolio Plus Thought Leadership

Of course there is always room for improvement, but the feedback on the format of the conference was positive. Ninety percent of the sessions had either a partner or a customer teamed with an Oracle presenter. The presentations weren't dry, one-way information dumps, but more interactive. I just followed up with a CEO who attended the conference with his Head of Marketing. He told me that they are using John Kembel's journey mapping methodology across the organization to pull people together. This sort of thought leadership in these highly competitive areas gives Oracle permission to engage around the technology. We have to differentiate ourselves, and it's harder to do on the product side because everyone looks the same on paper. But on thought leadership - we can, and did, take some really big steps.

David Vap
Group Vice President
Oracle Applications Product Development

    Read the article

  • Pace Layering Comes Alive

    - by Tanu Sood
Rick Beers is Senior Director of Product Management for Oracle Fusion Middleware. Prior to joining Oracle, Rick held a variety of executive operational positions at Corning, Inc. and Bausch & Lomb. With a professional background that includes senior management positions in manufacturing, supply chain and information technology, Rick brings a unique set of experiences to cover the impact that technology can have on business models, processes and organizations. Rick hosts the IT Leaders Editorial on a monthly basis.

By now, readers of this column are quite familiar with Oracle AppAdvantage, a unified framework of middleware technologies, infrastructure and applications utilizing a pace-layered approach to enterprise systems platforms:

1. Standardize and Consolidate core Enterprise Applications by removing invasive customizations, costly workarounds and the complexity that multiple instances create.
2. Move business-specific processes and applications to the Differentiate Layer, thus creating greater business agility with process extensions and best-of-breed applications managed by cross-application process orchestration.
3. The Innovate Layer contains all the business capabilities required for engagement, collaboration and intuitive decision making. This is the layer where innovation will occur, as people engage one another in a secure yet open and informed way.
4. Simplify IT by minimizing complexity, improving performance and lowering cost with secure, reliable and managed systems across the entire Enterprise.

But what hasn't been discussed is the pace-layered architecture that Oracle AppAdvantage adopts. What is it, what are its origins and why is it relevant to enterprise-scale applications and technologies? It's actually a fascinating tale that spans the past 20 years, and a basic understanding of it provides a wonderful context for what is evolving as the future of enterprise systems platforms.

It all begins in 1994 with a book by noted architect Stewart Brand, of 'Whole Earth Catalog' fame. In his 1994 book How Buildings Learn, Brand popularized the term 'Shearing Layers', arguing that any building is actually a hierarchy of pieces, each of which inherently changes at different rates. In 1997 he produced a six-part BBC series adapted from the book, in which Part 6 focuses on Shearing Layers. In this segment Brand begins to introduce the concept of 'pace'. Brand further refined this idea in his subsequent book, The Clock of the Long Now, which began to link the concept of Shearing Layers to computing and introduced the term 'pace layering', where he proposes that: "An imperative emerges: an adaptive [system] has to allow slippage between the differently-paced systems … otherwise the slow systems block the flow of the quick ones and the quick ones tear up the slow ones with their constant change.
Embedding the systems together may look efficient at first but over time it is the opposite and destructive as well."

In 2000, IBM architects Ian Simmonds and David Ing published a paper entitled A Shearing Layers Approach to Information Systems Development, which applied the concept of Shearing Layers to systems design and development. It argued that systems at the time were still too rigid; that they constrained organizations by their inability to adapt to changes. The findings in the Conclusions section are particularly striking: "Our starting motivation was that enterprises need to become more adaptive, and that an aspect of doing that is having adaptable computer systems. The challenge is then to optimize information systems development for change (high maintenance) rather than stability (low maintenance). Our response is to make it explicit within software engineering the notion of shearing layers, and explore it as the principle that systems should be built to be adaptable in response to the qualitatively different rates of change to which they will be subjected. This allows us to separate functions that should legitimately change relatively slowly and at significant cost from that which should be changeable often, quickly and cheaply."

The problem at the time, of course, was that this vision of adaptable systems was simply not possible within the confines of first-generation ERP, which was conceived, designed and developed for standardization and compliance. It wasn't until the maturity of open, standards-based integration, and the middleware innovation that followed, that pace layering became an achievable goal. And Oracle is leading the way. Oracle's AppAdvantage framework makes pace layering come alive by taking a strategic vision 20 years in the making and transforming it into a reality. It allows enterprises to retain and even optimize their existing ERP systems, while wrapping around those ERP systems three layers of capabilities that inherently adapt as needed, at a pace that's optimal for the enterprise.

    Read the article

  • why jQuery.parseJSON is not a function?

    - by Pandiya Chendur
I use the following jQuery statements and I am getting the error jQuery.parseJSON is not a function. My function is:

    function Iteratejsondata() {
        var HfJsonValue = { "Table": [
            { "Emp_Id": "3", "Identity_No": "", "Emp_Name": "Jerome", "Address": "Madurai", "Date_Of_Birth": "", "Desig_Name": "Supervisior", "Desig_Description": "Supervisior of the Construction", "SalaryBasis": "Monthly", "FixedSalary": "25000.00" },
            { "Emp_Id": "4", "Identity_No": "", "Emp_Name": "Mohan", "Address": "Madurai", "Date_Of_Birth": "", "Desig_Name": "Acc ", "Desig_Description": "Accountant", "SalaryBasis": "Monthly", "FixedSalary": "200.00" },
            { "Emp_Id": "5", "Identity_No": "", "Emp_Name": "Murugan", "Address": "Madurai", "Date_Of_Birth": "", "Desig_Name": "Mason", "Desig_Description": "Mason", "SalaryBasis": "Weekly", "FixedSalary": "150.00" },
            { "Emp_Id": "6", "Identity_No": "", "Emp_Name": "Ram", "Address": "Madurai", "Date_Of_Birth": "", "Desig_Name": "Mason", "Desig_Description": "Mason", "SalaryBasis": "Weekly", "FixedSalary": "120.00" },
            { "Emp_Id": "7", "Identity_No": "", "Emp_Name": "Raja", "Address": "Madurai", "Date_Of_Birth": "", "Desig_Name": "Mason", "Desig_Description": "Mason", "SalaryBasis": "Weekly", "FixedSalary": "135.00" },
            { "Emp_Id": "8", "Identity_No": "", "Emp_Name": "Raja kumar", "Address": "Madurai", "Date_Of_Birth": "", "Desig_Name": "Mason Helper", "Desig_Description": "Mason Helper", "SalaryBasis": "Weekly", "FixedSalary": "105.00" },
            { "Emp_Id": "9", "Identity_No": "", "Emp_Name": "Lakshmi", "Address": "Madurai", "Date_Of_Birth": "", "Desig_Name": "Mason Helper", "Desig_Description": "Mason Helper", "SalaryBasis": "Weekly", "FixedSalary": "100.00" },
            { "Emp_Id": "10", "Identity_No": "", "Emp_Name": "Palani", "Address": "Madurai", "Date_Of_Birth": "", "Desig_Name": "Carpenter", "Desig_Description": "Carpenter", "SalaryBasis": "Weekly", "FixedSalary": "200.00" },
            { "Emp_Id": "11", "Identity_No": "", "Emp_Name": "Annamalai", "Address": "Madurai", "Date_Of_Birth": "", "Desig_Name": "Carpenter", "Desig_Description": "Carpenter", "SalaryBasis": "Weekly", "FixedSalary": "220.00" },
            { "Emp_Id": "12", "Identity_No": "", "Emp_Name": "David", "Address": "Madurai", "Date_Of_Birth": "", "Desig_Name": "Steel Fixer", "Desig_Description": "Steel Fixer", "SalaryBasis": "Weekly", "FixedSalary": "220.00" },
            { "Emp_Id": "13", "Identity_No": "", "Emp_Name": "Chandru", "Address": "Madurai", "Date_Of_Birth": "", "Desig_Name": "Steel Fixer", "Desig_Description": "Steel Fixer", "SalaryBasis": "Weekly", "FixedSalary": "220.00" },
            { "Emp_Id": "14", "Identity_No": "", "Emp_Name": "Mani", "Address": "Madurai", "Date_Of_Birth": "", "Desig_Name": "Steel Helper", "Desig_Description": "Steel Helper", "SalaryBasis": "Weekly", "FixedSalary": "175.00" },
            { "Emp_Id": "15", "Identity_No": "", "Emp_Name": "Karthik", "Address": "Madurai", "Date_Of_Birth": "", "Desig_Name": "Wood Fixer", "Desig_Description": "Wood Fixer", "SalaryBasis": "Weekly", "FixedSalary": "195.00" },
            { "Emp_Id": "16", "Identity_No": "", "Emp_Name": "Bala", "Address": "Madurai", "Date_Of_Birth": "", "Desig_Name": "Wood Fixer", "Desig_Description": "Wood Fixer", "SalaryBasis": "Weekly", "FixedSalary": "185.00" },
            { "Emp_Id": "17", "Identity_No": "", "Emp_Name": "Tamil arasi", "Address": "Madurai", "Date_Of_Birth": "", "Desig_Name": "Wood Helper", "Desig_Description": "Wood Helper", "SalaryBasis": "Weekly", "FixedSalary": "185.00" },
            { "Emp_Id": "18", "Identity_No": "", "Emp_Name": "Perumal", "Address": "Madurai", "Date_Of_Birth": "", "Desig_Name": "Cook", "Desig_Description": "Cook", "SalaryBasis": "Weekly", "FixedSalary": "105.00" },
            { "Emp_Id": "19", "Identity_No": "", "Emp_Name": "Andiappan", "Address": "Madurai", "Date_Of_Birth": "", "Desig_Name": "Watchman", "Desig_Description": "Watchman", "SalaryBasis": "Weekly", "FixedSalary": "150.00" }
        ] };

        //var jsonObj = eval('(' + HfJsonValue + ')');
        var jsonObj = jQuery.parseJSON(HfJsonValue);

and my page looks like this:

    <div id="Pagination" class="page-numbers"></div>
    <br style="clear:both;" />
    <div id="Searchresult"></div>
    <div id="hiddenresult" style="display:none;"> </div>
    <script type="text/javascript">
        var pagination_options = {
            num_edge_entries: 2,
            num_display_entries: 8,
            callback: pageselectCallback,
            items_per_page: 3
        }
        function pageselectCallback(page_index, jq) {
            var items_per_page = pagination_options.items_per_page;
            var offset = page_index * items_per_page;
            var new_content = $('#hiddenresult div.resultsdiv').slice(offset, offset + items_per_page).clone();
            $('#Searchresult').empty().append(new_content);
            return false;
        }
        function initPagination() {
            var num_entries = $('#hiddenresult div.resultsdiv').length;
            // Create pagination element
            $("#Pagination").pagination(num_entries, pagination_options);
        }
        $(document).ready(function() {
            Iteratejsondata();
            initPagination();
        });
    </script>

I've inspected it with Firebug and saw that all the jQuery files have been downloaded, so why is this happening? Any suggestions?

    Read the article

  • CSS * {margin: 0;padding: 0;} override

    - by StealthRT
Hey all, I am in need of some help figuring out how to override the * {margin: 0; padding: 0;} in my CSS. The reason why is that I have this CSS:

    .postcomments { width: 547px; padding: 5px 5px 5px 5px; margin: 0 0 5px 0; }
    .postcomments a { text-decoration: underline; }
    .postcomments ul { margin: 0; padding: 0; list-style-type: none; }
    .postcomments ul li { width: 547px; margin: 0 0 5px 0; padding: 0; list-style-type: none; }
    .postcomments .right { color: #474747; font-size: 11px; background: #fff url('http://www.nextbowluser.com/img/ucBG.gif') no-repeat top left; line-height: 17px; padding: 5px 0 0 0; width: 430px; position: relative; float: right; min-height: 50px; }
    .postcomments .right .bottom { padding: 0 5px 15px 5px; background: #fff url('http://www.nextbowluser.com/img/ucBG.gif') no-repeat bottom right; min-height: 50px; }
    .postcomments .arrow { left: -15px; top: 20px; position: absolute; }
    .avatar { border: none !important; }
    .postcomments .left { float: left; margin: 0 7px 0 0; }
    .postcomments .gravatar { background: #fff; width: 80px; height: 60px; margin: 0 0px 0 0; padding: 3px; }
    .postcomments .name { font-size: 11px; margin: 2px 0 0 0; color: #000; }
    .avatar { border: none !important; }

and it displays just fine WITHOUT the * {margin: 0; padding: 0;}:

1st comment
2nd comment
3rd comment

However, when I add that to the CSS, it makes the comments stack wrong:

1st comment 2nd comment 3rd comment

I would just take out the * {margin: 0; padding: 0;}, but some other things on the page need it in order to be displayed correctly. So my question is: how can I override the {margin: 0; padding: 0;} for just that postcomments part of the page? Thanks! :o) David

    Read the article

  • Paging through records (json data) using jQuery...

    - by Pandiya Chendur
I have a JSON result that contains numerous records. I'd like to show the first five records on one page and create pager links that move through the records five at a time. I don't want the page to refresh, which is why I'm hoping for a combination of JavaScript and jQuery. My JSON data looks like this:

    {"Table" : [
    {"Emp_Id" : "3","Identity_No" : "","Emp_Name" : "Jerome","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Supervisior","Desig_Description" : "Supervisior of the Construction","SalaryBasis" : "Monthly","FixedSalary" : "25000.00"},
    {"Emp_Id" : "4","Identity_No" : "","Emp_Name" : "Mohan","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Acc ","Desig_Description" : "Accountant","SalaryBasis" : "Monthly","FixedSalary" : "200.00"},
    {"Emp_Id" : "5","Identity_No" : "","Emp_Name" : "Murugan","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason","Desig_Description" : "Mason","SalaryBasis" : "Weekly","FixedSalary" : "150.00"},
    {"Emp_Id" : "6","Identity_No" : "","Emp_Name" : "Ram","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason","Desig_Description" : "Mason","SalaryBasis" : "Weekly","FixedSalary" : "120.00"},
    {"Emp_Id" : "7","Identity_No" : "","Emp_Name" : "Raja","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason","Desig_Description" : "Mason","SalaryBasis" : "Weekly","FixedSalary" : "135.00"},
    {"Emp_Id" : "8","Identity_No" : "","Emp_Name" : "Raja kumar","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason Helper","Desig_Description" : "Mason Helper","SalaryBasis" : "Weekly","FixedSalary" : "105.00"},
    {"Emp_Id" : "9","Identity_No" : "","Emp_Name" : "Lakshmi","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason Helper","Desig_Description" : "Mason Helper","SalaryBasis" : "Weekly","FixedSalary" : "100.00"},
    {"Emp_Id" : "10","Identity_No" : "","Emp_Name" : "Palani","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Carpenter","Desig_Description" : "Carpenter","SalaryBasis" : "Weekly","FixedSalary" : "200.00"},
    {"Emp_Id" : "11","Identity_No" : "","Emp_Name" : "Annamalai","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Carpenter","Desig_Description" : "Carpenter","SalaryBasis" : "Weekly","FixedSalary" : "220.00"},
    {"Emp_Id" : "12","Identity_No" : "","Emp_Name" : "David","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Steel Fixer","Desig_Description" : "Steel Fixer","SalaryBasis" : "Weekly","FixedSalary" : "220.00"},
    {"Emp_Id" : "13","Identity_No" : "","Emp_Name" : "Chandru","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Steel Fixer","Desig_Description" : "Steel Fixer","SalaryBasis" : "Weekly","FixedSalary" : "220.00"},
    {"Emp_Id" : "14","Identity_No" : "","Emp_Name" : "Mani","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Steel Helper","Desig_Description" : "Steel Helper","SalaryBasis" : "Weekly","FixedSalary" : "175.00"},
    {"Emp_Id" : "15","Identity_No" : "","Emp_Name" : "Karthik","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Wood Fixer","Desig_Description" : "Wood Fixer","SalaryBasis" : "Weekly","FixedSalary" : "195.00"},
    {"Emp_Id" : "16","Identity_No" : "","Emp_Name" : "Bala","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Wood Fixer","Desig_Description" : "Wood Fixer","SalaryBasis" : "Weekly","FixedSalary" : "185.00"},
    {"Emp_Id" : "17","Identity_No" : "","Emp_Name" : "Tamil arasi","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Wood Helper","Desig_Description" : "Wood Helper","SalaryBasis" : "Weekly","FixedSalary" : "185.00"},
    {"Emp_Id" : "18","Identity_No" : "","Emp_Name" : "Perumal","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Cook","Desig_Description" : "Cook","SalaryBasis" : "Weekly","FixedSalary" : "105.00"},
    {"Emp_Id" : "19","Identity_No" : "","Emp_Name" : "Andiappan","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Watchman","Desig_Description" : "Watchman","SalaryBasis" : "Weekly","FixedSalary" : "150.00"}
    ] }

And as of now my result looks like this: http://img401.imageshack.us/img401/2500/yuidtsum.jpg

I have used jQuery for this:

    var jsonObj = JSON.parse(HfJsonValue);
    for (var i = jsonObj.Table.length - 1; i >= 0; i--) {
        var employee = jsonObj.Table[i];
        $('<div class="resultsdiv"><br /><span class="resultName">' + employee.Emp_Name + '</span><span class="resultfields" style="padding-left:100px;">Category&nbsp;:</span>&nbsp;<span class="resultfieldvalues">' + employee.Desig_Name + '</span><br /><br /><span id="SalaryBasis" class="resultfields">Salary Basis&nbsp;:</span>&nbsp;<span class="resultfieldvalues">' + employee.SalaryBasis + '</span><span class="resultfields" style="padding-left:25px;">Salary&nbsp;:</span>&nbsp;<span class="resultfieldvalues">' + employee.FixedSalary + '</span><span style="font-size:110%;font-weight:bolder;padding-left:25px;">Address&nbsp;:</span>&nbsp;<span class="resultfieldvalues">' + employee.Address + '</span></div>')
            .insertAfter('#ResultsDiv');
    }

My image contains only 6 records as of now. Any suggestions?

    Read the article

  • What is in your Mathematica tool bag?

    - by Timo
We all know that Mathematica is great, but it also often lacks critical functionality. What kind of external packages / tools / resources do you use with Mathematica? I'll edit this main post (and invite anyone else to do so too) to include resources which are focused on general applicability in scientific research and which as many people as possible will find useful. Feel free to contribute anything, even small code snippets (as I did below for a timing routine). Also, undocumented and useful features in Mathematica 7 and beyond that you found yourself, or dug up from some paper/site, are most welcome. Please include a short description or comment on why something is great or what utility it provides. If you link to books on Amazon with affiliate links, please mention it, e.g., by putting your name after the link.

Packages:
LevelScheme is a package that greatly expands Mathematica's capability to produce good-looking plots. I use it, if for nothing else, for the much, much improved control over frame/axes ticks.
David Park's Presentation Package ($50 - no charge for updates)

Tools:
MASH is Daniel Reeves's excellent Perl script, essentially providing scripting support for Mathematica 7. (This is finally built in as of Mathematica 8, with the -script option.)

Resources:
Wolfram's own repository MathSource has a lot of useful, if narrow, notebooks for various applications. Also check out the other sections, such as Current Documentation, Courseware for lectures, and Demos for, well, demos.

Books:
Mathematica programming: an advanced introduction by Leonid Shifrin (web, pdf) is a must-read if you want to do anything more than For loops in Mathematica.
Quantum Methods with Mathematica by James F. Feagin (amazon)
The Mathematica Book by Stephen Wolfram (amazon) (web)
Schaum's Outline (amazon)
Mathematica in Action by Stan Wagon (amazon) - 600 pages of neat examples, going up to Mathematica version 7. The visualization techniques are especially good; you can see some of them on the author's Demonstrations Page.
Mathematica Programming Fundamentals by Richard Gaylord (pdf) - a good, concise introduction to most of what you need to know about Mathematica programming.

Undocumented (or scarcely documented) Features:
How to customize Mathematica keyboard shortcuts. See this question.
How to inspect patterns and functions used by Mathematica's own functions. See this answer.
How to achieve consistent size for GraphPlots in Mathematica? See this question.

    Read the article

  • Sentiment analysis with NLTK python for sentences using sample data or webservice?

    - by Ke
I am embarking upon an NLP project for sentiment analysis. I have successfully installed NLTK for Python (seems like a great piece of software for this). However, I am having trouble understanding how it can be used to accomplish my task. Here is my task:

1. I start with one long piece of data (let's say several hundred tweets on the subject of the UK election, from their web service).
2. I would like to break this up into sentences (or info no longer than 100 or so chars) (I guess I can just do this in Python??).
3. Then I search through all the sentences for specific instances within each sentence, e.g. "David Cameron".
4. Then I would like to check for positive/negative sentiment in each sentence and count them accordingly.

NB: I am not really worried too much about accuracy, because my data sets are large, and also not worried too much about sarcasm.

Here are the troubles I am having:

1. All the data sets I can find, e.g. the corpus movie review data that comes with NLTK, aren't in web service format. It looks like this has had some processing done already. As far as I can see, the processing (by Stanford) was done with WEKA. Is it not possible for NLTK to do all this on its own? Here all the data sets have already been organised into positive/negative, e.g. the polarity dataset http://www.cs.cornell.edu/People/pabo/movie-review-data/ How is this done? (To organise the sentences by sentiment, is it definitely WEKA? Or something else?)

2. I am not sure I understand why WEKA and NLTK would be used together. It seems like they do much the same thing. If I'm processing the data with WEKA first to find sentiment, why would I need NLTK? Is it possible to explain why this might be necessary?

I have found a few scripts that get somewhat near this task, but all are using the same pre-processed data. Is it not possible to process this data myself to find sentiment in sentences, rather than using the data samples given in the link?

Any help is much appreciated and will save me much hair! Cheers, Ke

    Read the article

  • Good jquery pagination plugin to use with json Data...

    - by bala3569
I am looking for a good jQuery pagination plugin to use in my .aspx page. I have the following parameters: currentpage, pagesize, TotalRecords, NumberofPages. I would like my plugin to page the same way Stack Overflow does.

EDIT: It should paginate through JSON data, similar to this. I use my JSON data and iterate over it with jQuery:

    var jsonObj = jQuery.parseJSON(HfJsonValue);
    for (var i = jsonObj.Table.length - 1; i >= 0; i--) {
        var employee = jsonObj.Table[i];
        $('<div class="resultsdiv"><br /><span class="resultName">' + employee.Emp_Name + '</span><span class="resultfields" style="padding-left:100px;">Category&nbsp;:</span>&nbsp;<span class="resultfieldvalues">' + employee.Desig_Name + '</span><br /><br /><span id="SalaryBasis" class="resultfields">Salary Basis&nbsp;:</span>&nbsp;<span class="resultfieldvalues">' + employee.SalaryBasis + '</span><span class="resultfields" style="padding-left:25px;">Salary&nbsp;:</span>&nbsp;<span class="resultfieldvalues">' + employee.FixedSalary + '</span><span style="font-size:110%;font-weight:bolder;padding-left:25px;">Address&nbsp;:</span>&nbsp;<span class="resultfieldvalues">' + employee.Address + '</span></div>').insertAfter('#ResultsDiv');
    }

There are 25 divs on my page as a result; I want to show the first five divs on page 1, and so on. Any suggestions? My HfJsonValue contains the following JSON data:

    {"Table" : [
    {"Emp_Id" : "3","Identity_No" : "","Emp_Name" : "Jerome","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Supervisior","Desig_Description" : "Supervisior of the Construction","SalaryBasis" : "Monthly","FixedSalary" : "25000.00"},
    {"Emp_Id" : "4","Identity_No" : "","Emp_Name" : "Mohan","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Acc ","Desig_Description" : "Accountant","SalaryBasis" : "Monthly","FixedSalary" : "200.00"},
    {"Emp_Id" : "5","Identity_No" : "","Emp_Name" : "Murugan","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason","Desig_Description" : "Mason","SalaryBasis" : "Weekly","FixedSalary" : "150.00"},
    {"Emp_Id" : "6","Identity_No" : "","Emp_Name" : "Ram","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason","Desig_Description" : "Mason","SalaryBasis" : "Weekly","FixedSalary" : "120.00"},
    {"Emp_Id" : "7","Identity_No" : "","Emp_Name" : "Raja","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason","Desig_Description" : "Mason","SalaryBasis" : "Weekly","FixedSalary" : "135.00"},
    {"Emp_Id" : "8","Identity_No" : "","Emp_Name" : "Raja kumar","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason Helper","Desig_Description" : "Mason Helper","SalaryBasis" : "Weekly","FixedSalary" : "105.00"},
    {"Emp_Id" : "9","Identity_No" : "","Emp_Name" : "Lakshmi","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Mason Helper","Desig_Description" : "Mason Helper","SalaryBasis" : "Weekly","FixedSalary" : "100.00"},
    {"Emp_Id" : "10","Identity_No" : "","Emp_Name" : "Palani","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Carpenter","Desig_Description" : "Carpenter","SalaryBasis" : "Weekly","FixedSalary" : "200.00"},
    {"Emp_Id" : "11","Identity_No" : "","Emp_Name" : "Annamalai","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Carpenter","Desig_Description" : "Carpenter","SalaryBasis" : "Weekly","FixedSalary" : "220.00"},
    {"Emp_Id" : "12","Identity_No" : "","Emp_Name" : "David","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Steel Fixer","Desig_Description" : "Steel Fixer","SalaryBasis" : "Weekly","FixedSalary" : "220.00"},
    {"Emp_Id" : "13","Identity_No" : "","Emp_Name" : "Chandru","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Steel Fixer","Desig_Description" : "Steel Fixer","SalaryBasis" : "Weekly","FixedSalary" : "220.00"},
    {"Emp_Id" : "14","Identity_No" : "","Emp_Name" : "Mani","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Steel Helper","Desig_Description" : "Steel Helper","SalaryBasis" : "Weekly","FixedSalary" : "175.00"},
    {"Emp_Id" : "15","Identity_No" : "","Emp_Name" : "Karthik","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Wood Fixer","Desig_Description" : "Wood Fixer","SalaryBasis" : "Weekly","FixedSalary" : "195.00"},
    {"Emp_Id" : "16","Identity_No" : "","Emp_Name" : "Bala","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Wood Fixer","Desig_Description" : "Wood Fixer","SalaryBasis" : "Weekly","FixedSalary" : "185.00"},
    {"Emp_Id" : "17","Identity_No" : "","Emp_Name" : "Tamil arasi","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Wood Helper","Desig_Description" : "Wood Helper","SalaryBasis" : "Weekly","FixedSalary" : "185.00"},
    {"Emp_Id" : "18","Identity_No" : "","Emp_Name" : "Perumal","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Cook","Desig_Description" : "Cook","SalaryBasis" : "Weekly","FixedSalary" : "105.00"},
    {"Emp_Id" : "19","Identity_No" : "","Emp_Name" : "Andiappan","Address" : "Madurai","Date_Of_Birth" : "","Desig_Name" : "Watchman","Desig_Description" : "Watchman","SalaryBasis" : "Weekly","FixedSalary" : "150.00"}
    ]}

    Read the article

  • Simplify CASE in VB.net code

    - by StealthRT
Hey all, I am looking to see if anyone has a better way to accomplish the task below in less code.

    Select Case mainMenu.theNumOpened
        Case 1
            Me.Text = "NBMsg1"
            Me.SetDesktopLocation(My.Computer.Screen.WorkingArea.Width - 302, 5)
        Case 2
            Me.Text = "NBMsg2"
            Dim hwnd As IntPtr = FindWindow(vbNullString, "NBMsg1")
            SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, Me.Height + 10, 0, 0, 1)
            Me.SetDesktopLocation(My.Computer.Screen.WorkingArea.Width - 302, 5)
        Case 3
            Me.Text = "NBMsg3"
            Dim hwnd As IntPtr = FindWindow(vbNullString, "NBMsg2")
            SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, Me.Height + 10, 0, 0, 1)
            hwnd = FindWindow(vbNullString, "NBMsg1")
            SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, (Me.Height * 2) + 15, 0, 0, 1)
            Me.SetDesktopLocation(My.Computer.Screen.WorkingArea.Width - 302, 5)
        Case 4
            Me.Text = "NBMsg4"
            Dim hwnd As IntPtr = FindWindow(vbNullString, "NBMsg3")
            SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, Me.Height + 10, 0, 0, 1)
            hwnd = FindWindow(vbNullString, "NBMsg2")
            SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, (Me.Height * 2) + 15, 0, 0, 1)
            hwnd = FindWindow(vbNullString, "NBMsg1")
            SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, (Me.Height * 3) + 20, 0, 0, 1)
            Me.SetDesktopLocation(My.Computer.Screen.WorkingArea.Width - 302, 5)
        Case 5
            Me.Text = "NBMsg5"
            Dim hwnd As IntPtr = FindWindow(vbNullString, "NBMsg4")
            SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, Me.Height + 10, 0, 0, 1)
            hwnd = FindWindow(vbNullString, "NBMsg3")
            SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, (Me.Height * 2) + 15, 0, 0, 1)
            hwnd = FindWindow(vbNullString, "NBMsg2")
            SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, (Me.Height * 3) + 20, 0, 0, 1)
            hwnd = FindWindow(vbNullString, "NBMsg1")
            SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, (Me.Height * 4) + 25, 0, 0, 1)
            Me.SetDesktopLocation(My.Computer.Screen.WorkingArea.Width - 302, 5)
        Case 6
            Me.Text = "NBMsg6"
            Dim hwnd As IntPtr = FindWindow(vbNullString, "NBMsg5")
            SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, Me.Height + 10, 0, 0, 1)
            hwnd = FindWindow(vbNullString, "NBMsg4")
            SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, (Me.Height * 2) + 15, 0, 0, 1)
            hwnd = FindWindow(vbNullString, "NBMsg3")
            SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, (Me.Height * 3) + 20, 0, 0, 1)
            hwnd = FindWindow(vbNullString, "NBMsg2")
            SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, (Me.Height * 4) + 25, 0, 0, 1)
            hwnd = FindWindow(vbNullString, "NBMsg1")
            SetWindowPos(hwnd, 0, My.Computer.Screen.WorkingArea.Width - 302, (Me.Height * 5) + 30, 0, 0, 1)
            Me.SetDesktopLocation(My.Computer.Screen.WorkingArea.Width - 302, 5)
        Case Else
            Me.Close()
            Me.Dispose()
    End Select

What it does is take how many windows are currently already opened. So if one is open, then of course it goes to Case 1. If there are 2 opened, then it moves the oldest down and puts the newest on top, and so on. I have set it so that a max of 6 boxes can be opened at one time. If anyone knows how I could also "slide" them down (kind of like a jQuery effect), then that would also be, well, awesome to know! :o) Any help/suggestions would be great! :o) David

    Read the article

< Previous Page | 120 121 122 123 124 125 126 127 128  | Next Page >