Search Results

Search found 1185 results on 48 pages for 'richard banks'.

Page 2/48 | < Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >

  • The .NET Rocks! Visual Studio 2010 Road Trip

    - by Laila
    Carl Franklin and Richard Campbell, the two .NET Rocks radio show hosts, have decided to set off to 15 cities in the US, between April 19th and May 7th, in their DotNetMobile (a 30 foot RV). What for, you ask? Well, to drive around the US, meet up with .NET developers, and show off the latest and greatest in Visual Studio 2010 and .NET 4.0! Each evening, they stop in a city and host a three-hour event in front of a crowd of 100 to 300 developers, where Carl shows off media features in Silverlight 4 and their road trip tracking application, whilst Richard demos the web performance testing features of VS2010 using his portable server rig. But before they take to the stage, they have a special guest brought in - a rock star from the Visual Studio world - whom they interview for an hour as a .NET Rocks episode. So far, they've had - amongst others - Phil Haack, a Program Manager with the ASP.NET team working on ASP.NET MVC; Dan Fernandez, an Evangelism Manager in the Developer and Platform Evangelism team at Microsoft; and Beth Massi, Senior Program Manager on the Visual Studio Community Team at Microsoft. I love the fact that the audience gets a chance to participate, ask questions and have a great laugh, as you can hear in the first episode! Along the way, the .NET Rocks guys are giving away great prizes (including .NET Reflector Pro, ANTS Memory Profiler licenses, and 40" LCD TVs!). Even more out of the ordinary, at each stop on the road trip, one lucky attendee (who entered the Ride Along competition) gets to jump in the RV with Carl and Richard and ride along with them to the next stop on the road trip. How cool is that! Richard told us: "Our first winner in Mountain View was Eric Ziko. I was looking for him to announce that he had won, when he found us and gave us a bottle of scotch he had brought just to say 'thanks for the great show'. We all had a toast from the bottle the next night when he headed back home." Cheeky! There's still space at a few of these events, so if you want to attend, register now, because it's first come, first served. We're grateful to Richard and Carl for giving us the opportunity to sponsor this major .NET event! A unique .NET adventure worth following for sure. Cheers, Laila

    Read the article

  • CRM Partner Community Monthly Newsletter

    - by Richard Lefebvre
    Dear CRM Partner, The Oracle EMEA CRM Partner Community Newsletter was broadcast last Thursday to 2,000 contacts across EMEA. If you want to be informed about Oracle programs and events for CRM partners by receiving this regular newsletter, as well as other important communications, please register on the EMEA CRM Community Pages. I look forward to welcoming you to our Community. Warm regards, Richard Lefebvre - EMEA CRM Partners Program Director

    Read the article

  • Oracle Sales Cloud Demo environments for partners

    - by Richard Lefebvre
    We are happy to inform our EMEA-based CRM & CX partners that a new process is in place for partners to get access to the Oracle Sales Cloud (Fusion CRM SaaS) demo environment. If you are interested in taking advantage of it, please send a short email to richard[email protected]. This offer - subject to final approval - is limited to EMEA-based partners who have certified at least one sales and one presales person on Oracle Sales Cloud.

    Read the article

  • Oracle CX Cloud promotions extended twice (products and duration)!

    - by Richard Lefebvre
    The Oracle Cloud promotions, which include free months and/or pre-approved discounts (subject to T&Cs), are extended throughout November 2014 and now cover more products, including Oracle Fusion CRM Cloud Service (Oracle Sales Cloud), Oracle RightNow Cloud Service and Social Relationship Management. For more information about these exciting promotions, please contact your local Oracle CX Sales Representative, Oracle Direct, your Oracle Alliance Manager or richard[email protected].

    Read the article

  • Incrementing the Assembly Version in TFS Builds and its Effect on Other Build Definitions

    - by ssmantha
    A very common scenario when performing TFS builds is incrementing the version number of the assemblies. There are quite a few approaches, of which I would like to share two links:
    Ewald Hofman's approach: http://www.ewaldhofman.nl/post/2010/05/13/Customize-Team-Build-2010-e28093-Part-5-Increase-AssemblyVersion.aspx#id_02e7b082-ce95-49a9-92e9-7dc88887b377
    Richard Banks' approach: http://www.richard-banks.org/2010/07/how-to-versioning-builds-with-tfs-2010.html
    Both of these approaches work well; however, there are scenarios where editing and checking in the assembly version information can create problems for build definitions meant for Continuous Integration or gated check-ins. You can suppress Continuous Integration builds while checking in the assembly info file by putting the comment "***NO_CI***" in the check-in comment, as specified by Ewald in his blog. However, if you have gated check-in in place, this is difficult to suppress; I tried to suppress the build trigger during the check-in process myself, but things did not turn out well. That's where Richard's solution comes in handy. Both solutions have their own pros and cons, which I believe can only be experienced over a period of time. In the case of Richard's solution, I believe we don't keep any history of the assembly version info file, and when you get the latest version of the solution that information will be lost. If you look closely, suppressing Continuous Integration (the NO_CI approach in check-in comments) is a workaround provided by Microsoft; however, I haven't found anything to suppress gated check-in so far. Suggestions or findings are most welcome.
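    Ewald's workaround hinges on the ***NO_CI*** token in the check-in comment. Purely as an illustration (a minimal sketch of my own, not code from either blog post), a version-bump check-in carrying that token could look roughly like this with the TFS 2010 client API; the collection URL, workspace name and file path are hypothetical placeholders.

        // Minimal sketch: pend an edit to AssemblyInfo.cs, bump the version, and check it in
        // with the ***NO_CI*** token so Team Build does not queue a new CI build for the commit.
        // The server URL, workspace name and file path are hypothetical.
        using System;
        using Microsoft.TeamFoundation.Client;
        using Microsoft.TeamFoundation.VersionControl.Client;

        class VersionBumpCheckIn
        {
            static void Main()
            {
                var collection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(
                    new Uri("http://tfs:8080/tfs/DefaultCollection"));                  // hypothetical collection
                var vcs = collection.GetService<VersionControlServer>();
                var workspace = vcs.GetWorkspace("BuildWorkspace", vcs.AuthorizedUser); // hypothetical workspace

                const string assemblyInfo = @"C:\src\MyApp\Properties\AssemblyInfo.cs"; // hypothetical path
                workspace.PendEdit(assemblyInfo);

                // ...rewrite the AssemblyVersion/AssemblyFileVersion attributes in the file here...

                PendingChange[] changes = workspace.GetPendingChanges(assemblyInfo);
                // The ***NO_CI*** marker in the comment is what suppresses the CI trigger.
                workspace.CheckIn(changes, "Increment assembly version ***NO_CI***");
            }
        }

    As the post notes, there is no equivalent token for gated check-in definitions, which is where Richard's approach of versioning inside the build itself helps.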

    Read the article

  • .NET Rocks is on the Road Again!

    - by Scott Spradlin
    Carl and Richard are loading up the DotNetMobile (a 30 foot RV) and driving to our town again to show off their favorite bits of Visual Studio 2010 and .NET 4.0! Richard talks about web load testing and Carl talks about Silverlight 4.0 and multimedia. And to make the night even more fun, they are going to bring a mystery rock star from the Visual Studio world to the event and interview them for a special .NET Rocks Road Trip show series. Along the way we'll be giving away some great prizes, showing off some awesome technology and having a ton of laughs. So come out to the most fun you can have in a geeky evening - and learn a few things along the way about web load testing and Silverlight 4! And one lucky person at the event will win "Ride Along with Carl and Richard" and get to board the RV and ride with the boys to the next town on the tour -- Chicago. (Don't worry, they will get you home again!) So come out to the most fun you can have in a geeky evening - and find out what's new and cool in Visual Studio 2010! To ensure we have sufficient food for everyone, please register for this event at http://stlnet.eventbrite.com. The registration information will only be used to obtain accurate counts for food preparation; all other answers are optional and will be used for purely statistical analysis. No information will be shared outside the St. Louis .NET User Group. Here is a list of prizes to be given away at the event:
    Telerik Premium Collection
    PreEmptive one-year Commercial Runtime Intelligence license
    Red Gate ANTS Memory Profiler
    Quest Toad Extension for Visual Studio
    DevExpress CodeRush and Refactor! Pro
    GrapeCity ActiveReports/BI Suite
    GrapeCity Spread 5.0
    JetBrains ReSharper
    ComponentOne Studio for ASP.NET
    ComponentOne Studio for Silverlight
    Please check out the event sponsors, and visit http://www.dotnetrocks.com/roadtrip for more information!
    Thursday, April 29, 2010
    6:00 pm - Food and social
    6:30 pm - .NET Rocks interview
    7:15 pm - Richard Campbell
    8:00 pm - Carl Franklin
    8:45 pm - Prizes!

    Read the article

  • Four Emerging Payment Stories

    - by David Dorf
    The world of alternate payments has been moving fast of late. Innovation in this area will help both consumers and retailers, but probably hurt the banks (at least that's the plan). Here are four recent news items in this area:
    Dwolla, a start-up in Iowa, is trying to make credit cards obsolete. Twelve guys in Des Moines are using the $1.3M they raised to allow businesses to skip the credit card networks and avoid the fees. Today they move about $1M a day across their network with an average transaction size of $500. Instead of charging merchants 2.9% plus $.30 per transaction, Dwolla charges a quarter -- yep, that coin featuring George Washington. Dwolla (Web + Dollar = Dwolla) avoids the credit networks and connects directly to bank accounts using the banks' ACH network. They are signing up banks and merchants, targeting B2B and C2B as well as P2P payments. They leverage social networks to notify people they have a money transfer, and also have a mobile app that uses GPS location. However, all is not rosy. There have been complaints about unexpected chargebacks, and with debit fees being reduced by the big banks, the need is not as pronounced. The big banks are working on their own network, called clearXchange, that could provide stiff competition.
    VeriFone just bought European payment processor Point for around $1B. By itself this would not have caught my attention except for the fact that VeriFone also announced the acquisition of GlobalBay earlier this month. In addition to their core business of selling stand-beside payment terminals, with GlobalBay they get employee-operated mobile selling tools and with Point they get a very big payment processing platform.
    MasterCard and Intel announced a partnership around payments, starting with PayPass, MasterCard's new payment technology. Intel will lend its expertise to add additional levels of security, which seems to be the biggest barrier to consumer adoption. Everyone is scrambling to get their piece of cash transactions, which still represent 85% of all transactions.
    Apple was awarded another mobile payment patent, further cementing the rumors that the iPhone 5 will support NFC payments. As usual, Apple is upsetting the apple cart (sorry) by moving control of key data from the carriers to Apple. With Apple's vast number of iTunes accounts, they have a ready-made customer base to use the payment infrastructure, which I bet will slowly transition people away from credit cards and toward cheaper ACH. Gary Schwartz explains the three-step process Apple is taking to become a payment processor.
    Below is a picture I drew representing payments in the retail industry. There's certainly a lot of innovation happening.

    Read the article

  • PHP Composer Not Working On Mac

    - by Richard Knop
    I have installed a Bitnami Mac stack, mainly because I require at least PHP 5.4.7 for my project. However, I have run into an issue with Composer. This is the error I get when I run php composer.phar install --dev:

        Richard-Knops-MacBook-Pro:my-project richardknop$ php composer.phar install --dev
        dyld: Library not loaded: /Applications/MAMP/Library/lib/libiconv.2.dylib
          Referenced from: /opt/local/bin/php
          Reason: Incompatible library version: php requires version 8.0.0 or later, but libiconv.2.dylib provides version 7.0.0
        Trace/BPT trap
        Richard-Knops-MacBook-Pro:my-project richardknop$

    How do I solve it?

    Read the article

  • How to print a web page that contains flash

    - by Richard
    I am using the Chromium browser to display the following web page: http://www.primaryworksheets.co.uk/multiws/multi23.html. I want to print off this maths worksheet for my son, but all I ever get out of my printer is a blank page. The web page appears to be produced using Flash. I have been to the Software Centre and re-installed the Flash plugin, but that did not help. I don't seem to have problems printing anything else. Firefox isn't any better. Can anyone tell me what else I might try? I'm using 11.04. Thanks, Richard

    Read the article

  • Webcam not detected

    - by Richard
    After much research and googling, my webcam is still not recognized. Under Ubuntu 11.04 it was working fine; I did a fresh install of 11.10 and now there is no webcam. Cheese, Camera Monitor and the V4L tests all fail with "cannot connect to /dev/video0" or equivalent. The output of 'lsusb' shows the webcam:

        Bus 001 Device 002: ID 05a9:2640 OmniVision Technologies, Inc. OV2640 Webcam

    What I noticed is that neither /dev/v4l nor /dev/video* exists. If I re-install the v4l package, all the /dev/video* devices are created, but no /dev/v4l. If I reboot, the /dev/video* devices are NOT created. I think the trouble is that the devices are not created at boot time. I have a Dell Inspiron 1525, and until this fresh install the webcam worked fine. Can somebody help? Thanks a lot, Richard

    Read the article

  • Next Fusion CRM Webinar for Partners (Monday March 19th, 3pm GMT): Fusion CRM User Interface, Activity Streams and Opportunity management

    - by Richard Lefebvre
    The next session of our weekly Fusion CRM webinar for EMEA partners will take place on Monday, March 19th at 3pm GMT / 4pm CET and will address the Fusion CRM User Interface, Activity Streams and Opportunity Management. In order to check the complete agenda and see the login details, please visit our dedicated microsite.
    How to join the dedicated microsite: click on http://isdportal.oracle.com/isd_html/sf.htm, enter your email address in the corresponding field, and enter fusion_crm in the “Access URL/Page Token” field.
    Agenda: the list of sessions is published and will be regularly updated in the microsite.
    Duration: each session lasts up to 60 minutes.
    Webex: the respective webinar link and session ID are published in the microsite.
    Audio: the audio call details (telephone numbers by country, call number and password) are indicated in the microsite.
    Slides: for your convenience, a PDF copy of each presentation will be stored in the microsite's document section.
    We hope that this series of webcasts will be instrumental to your Fusion CRM business success! For further information please contact me at richard[email protected]

    Read the article

  • CRM on Demand Marketing (ODM) Training for Partners - Munich - Dec 14-16th, 2011

    - by Richard Lefebvre
    We are pleased to inform you that we will be conducting a CRM on Demand Marketing (ODM) training workshop for EMEA partners on December 14th-16th in Munich (Germany). This 2.5-day instructor-led course will focus on preparing Oracle CRM on Demand practitioners and clients to implement and operationalize ODM. The course will consist of two main tracks: Marketing User and Product Specialist. The course will be limited to 50 participants (with a maximum of 3 attendees per partner company). For detailed information and the registration link, please refer to the invitation which will be sent shortly to the EMEA CRM on Demand Partners community, or contact richard[email protected]. To join the EMEA CRM on Demand Partners community, follow this link: www.oracle.com/partners/goto/crm-saas-emea

    Read the article

  • How to reduce your CRM migration project's risks?

    - by Richard Lefebvre
    In this 1:38 video, discover how you can dramatically reduce your CRM migration project's risks, costs and budgets with the market-leading CRM data migration tool, which offers a turnkey migration platform from Salesforce, Microsoft Dynamics or Oracle CRM OnDemand onto Oracle Sales Cloud. This solution is open to any Oracle CRM & CX implementation partner (e.g. System Integrators) as a means to complement their own offer. For any additional details or for an introduction to the tool, please contact richard[email protected] or visit www.conemis.com

    Read the article

  • USB Bose speakers with ubuntu 12.04

    - by Richard
    On a fresh install of Ubuntu 12.04 my Bose USB speakers weren't working at all. I read through a large number of posts, here and elsewhere, on Bose and USB speakers, and tried many things (modifying a number of different files, etc.). This worked... sort of: USB Audio on 0 volume on startup. AlsaMixer works after I turn up the volume, but the sound crackles a lot. Also, having to use a terminal mixer to change the volume isn't ideal. Any other ideas? Thank you, Richard

    Read the article

  • Process Centric Banking: Loan Origination Solution

    - by Manish Palaparthy
    There is an old proverb that goes, "The difference between theory and practice is greater in practice than in theory." So we keep doing numerous proofs of concept (PoCs) with our own products on various business cases to analyze them deeply, understand them and explain them to our customers. We then present our learnings as they happened; awareness of each PoC should help readers trust the results coming out of these PoCs. Here I present one such PoC in which we invested a lot of time and effort.
    Process Centric Banking: Loan Origination Solution
    Loan origination is the process by which a borrower applies for a new loan and the lender processes that application. It includes the series of steps taken by the bank from the point the customer shows interest in a loan product all the way to disbursal of funds. The loan origination process is relevant for many kinds of lenders in financial services: banks, credit unions, NBFCs (Non-Banking Financial Companies) and so on. For simplicity's sake, I will use "bank" as the lending institution in the rest of this article.
    Loan origination is one of the core processes for banks, as it is the process by which the bank creates the assets from which the institution earns most of its profits. A well-tuned loan origination process can affect the bank in many positive ways, so banks have always shown great interest in automating it. However, due to constant changes in the customer environment, market dynamics, prevailing economic conditions, cost pressures and the regulatory environment, they run into a lot of challenges. Let me categorize some of these challenges:
    Customer environment
    - Multiple channels: customers can use any of the available channels (Internet banking, email, fax, branch, phone banking, ATM, broker, mobile, snail mail) to perform all or some of the activities related to their loan.
    - Visibility into the origination process: customers expect immediate updates on the status of loan processing, plus alert messages.
    - Reduced turnaround time: customers expect loans to be processed with the least possible turnaround time.
    - Reduced loan processing fees: partly due to market dynamics, customers expect the loan processing fee to be negligible.
    Market dynamics
    - Competitive environment: competitors keep creating variants of loan products to attract customers; the bank needs to create similar product variants with better offers to attract new customers or keep existing ones.
    - Ability to migrate loans from one vendor to another: it has become really easy for retail customers to move from one bank to another, given low processing fees and highly attractive offers. How does the bank protect its customer base while actively engaging potential customers who bank with competitors?
    - Flexibility to react to market developments: market developments greatly influence loan processing, underwriting, asset valuation and risk mitigation rules. Can the bank modify rules and policies not just to react to market developments but to proactively manage them?
    Economic conditions
    - Constant change in various rates and their implications on the rates and rules applied when onboarding a loan: how quickly can the bank apply changes to the rates offered to customers when the central bank changes its rates?
    - Audit requirements from the central banker: tough economic conditions have demanded much more stringent audit rules and tests. The bank needs to produce ready reports (historic and operational) for audit compliance.
    - Risk mitigation: while risk mitigation has always been a key concern, this is the area where the bank's underwriters and risk analysts spend the most time when processing a loan application. To reduce turnaround time, the bank cannot compromise on its risk mitigation strategies.
    Cost pressures
    - Reduce the cost of processing per application: to deliver a reduced loan processing fee to the customer, the bank needs to keep its cost of processing each loan application low.
    - Meet customer turnaround expectations while reducing the queues and systems used to process the loan application: a loan application can spend a lot of time waiting in a queue for further processing. Different volumes and patterns of applications demand different queuing algorithms, so the bank needs real-time visibility into these queues and the flexibility to change queuing algorithms at runtime.
    - Increase the use of electronic communication and reduce branch channel usage: less automation not only leads to increased turnaround time, it also adds cost to reaching out to customers.
    The objective of our PoC was to implement a Loan Origination Solution whose ownership lies with the bank and which effectively meets the challenges listed above. We built a simple storyboard for the solution, then went about implementing it using Oracle BPM Suite and WebCenter Content: Imaging. The web UI has been built on ADF technologies, while the integration with core services has been implemented using the underlying SOA infrastructure. The BPM process model is quite exhaustive and can meet all the challenges listed above to a reasonable degree.
    A bank intending to implement an end-to-end Loan Origination Solution has multiple options at its disposal. It can:
    - Develop a custom Loan Origination application from scratch: gives maximum opportunity to build what you want, but is inflexible to upgrade and maintain, with a higher TCO in the long term.
    - Buy a packaged application and customize it: customizing a generic loan application can be tedious and prove as difficult as the above.
    - Build it using many disparate, un-integrated tools: initially seems easier than developing from scratch, but without integrated tool sets this is not a viable approach either.
    - Or adopt a solution based on a framework: independent services and business process modeling provide a decoupled architecture that is flexible.
    We built this framework end-to-end, with the core process of loan origination and several sub-processes such as analyse and define customer needs, customer credit verification, identity checks, legal review, new customer registration and risk assessment.

    Read the article

  • New user to Linux Mint and Alinux

    - by Richard
    Which is better for a new Linux user, Linux Mint or Alinux? I had trouble getting Alinux to run after install; Mint ran fine. We are also with the Tucson Refugee Ministry, and we have computers given to us all the time; we reload the OS, clean them up, and then give them to the refugee families. We need something that will run on older computers with 256 or 512 MB of RAM, and I do not want to use Microsoft. Please help. PS: Is there any other Linux OS you would recommend? Thanks, Richard Smith [email protected]

    Read the article

  • How to use a local Leopard Server Mail server acting "like" an Exchange mail server

    - by Richard Chevre
    We have a local Exchange 2003 server (company.local) which is collecting POP3 mail accounts on a distant (company.com) mail server. The mails are collected by the Exchange server every 5-10 minutes and stored locally (on company.local), so the users can read them without going to the "real" mail server (company.com). What was explained to me is that the mail collection is done with POP. Now we are migrating to Snow Leopard Server. We have chosen to use a new extension for our local domain: .leo. So our mail server's FQDN is mail.company.leo, and the users have a user [email protected] formatted mail address. A) All works fine except that I can't find how to tell mail.company.leo that it must retrieve the mails from the "real" public server (mail.company.com); I'm hoping to use IMAP and not POP. B) I can send mail using SMTP relay from mail.company.leo, but (I know it's trivial) answering is not possible, even if I specify the reply-to as [email protected] (this seems to be related to A). I don't know if it's very complicated to achieve what I want to do (I suspect not), and I'm not a genius, but as I'm a little bit lost I hope somebody can or will help me. Solving this will allow us to use iCal invitations too, so a lot of services depend on these mail server settings. Some of you discussed the fact that we chose to use a "new" TLD with the .leo extension. We have no problem with that; we could use .local, no problem ;) We used .leo instead of .local just to differentiate the two systems (Exchange and Snow Leopard Server). The question was not about that; it was just to know if we can set up a Snow Leopard mail server to act like an Exchange server. Again, thank you for your advice and help. Thanks in advance, Richard

    Read the article

  • Restful Services, oData, and Rest Sharp

    - by jkrebsbach
    After a great presentation by Jason Sheehan at MDC about RestSharp, I decided to implement it. RestSharp is a .Net framework for consuming restful data sources via either Json or XML. My first step was to put together a Restful data source for RestSharp to consume.  Staying entirely withing .Net, I decided to use Microsoft's oData implementation, built on System.Data.Services.DataServices.  Natively, these support Json, or atom+pub xml.  (XML with a few bells and whistles added on) There are three main steps for creating an oData data source: 1)  override CreateDSPMetaData This is where the metadata data is returned.  The meta data defines the structure of the data to return.  The structure contains the relationships between data objects, along with what properties the objects expose.  The meta data can and should be somehow cached so that the structure is not rebuild with every data request. 2) override CreateDataSource The context contains the data the data source will publish.  This method is the conduit which will populate the metadata objects to be returned to the requestor. 3) implement static InitializeService At this point we can set up security, along with setting up properties of the web service (versioning, etc)   Here is a web service which publishes stock prices for various Products (stocks) in various Categories. namespace RestService {     public class RestServiceImpl : DSPDataService<DSPContext>     {         private static DSPContext _context;         private static DSPMetadata _metadata;         /// <summary>         /// Populate traversable data source         /// </summary>         /// <returns></returns>         protected override DSPContext CreateDataSource()         {             if (_context == null)             {                 _context = new DSPContext();                 Category utilities = new Category(0);                 utilities.Name = "Electric";                 Category financials = new Category(1);                 financials.Name = "Financial";                                 IList products = _context.GetResourceSetEntities("Products");                 Product electric = new Product(0, utilities);                 electric.Name = "ABC Electric";                 electric.Description = "Electric Utility";                 electric.Price = 3.5;                 products.Add(electric);                 Product water = new Product(1, utilities);                 water.Name = "XYZ Water";                 water.Description = "Water Utility";                 water.Price = 2.4;                 products.Add(water);                 Product banks = new Product(2, financials);                 banks.Name = "FatCat Bank";                 banks.Description = "A bank that's almost too big";                 banks.Price = 19.9; // This will never get to the client                 products.Add(banks);                 IList categories = _context.GetResourceSetEntities("Categories");                 categories.Add(utilities);                 categories.Add(financials);                 utilities.Products.Add(electric);                 utilities.Products.Add(electric);                 financials.Products.Add(banks);             }             return _context;         }         /// <summary>         /// Setup rules describing published data structure - relationships between data,         /// key field, other searchable fields, etc.         
/// </summary>         /// <returns></returns>         protected override DSPMetadata CreateDSPMetadata()         {             if (_metadata == null)             {                 _metadata = new DSPMetadata("DemoService", "DataServiceProviderDemo");                 // Define entity type product                 ResourceType product = _metadata.AddEntityType(typeof(Product), "Product");                 _metadata.AddKeyProperty(product, "ProductID");                 // Only add properties we wish to share with end users                 _metadata.AddPrimitiveProperty(product, "Name");                 _metadata.AddPrimitiveProperty(product, "Description");                 EntityPropertyMappingAttribute att = new EntityPropertyMappingAttribute("Name",                     SyndicationItemProperty.Title, SyndicationTextContentKind.Plaintext, true);                 product.AddEntityPropertyMappingAttribute(att);                 att = new EntityPropertyMappingAttribute("Description",                     SyndicationItemProperty.Summary, SyndicationTextContentKind.Plaintext, true);                 product.AddEntityPropertyMappingAttribute(att);                 // Define products as a set of product entities                 ResourceSet products = _metadata.AddResourceSet("Products", product);                 // Define entity type category                 ResourceType category = _metadata.AddEntityType(typeof(Category), "Category");                 _metadata.AddKeyProperty(category, "CategoryID");                 _metadata.AddPrimitiveProperty(category, "Name");                 _metadata.AddPrimitiveProperty(category, "Description");                 // Define categories as a set of category entities                 ResourceSet categories = _metadata.AddResourceSet("Categories", category);                 att = new EntityPropertyMappingAttribute("Name",                     SyndicationItemProperty.Title, SyndicationTextContentKind.Plaintext, true);                 category.AddEntityPropertyMappingAttribute(att);                 att = new EntityPropertyMappingAttribute("Description",                     SyndicationItemProperty.Summary, SyndicationTextContentKind.Plaintext, true);                 category.AddEntityPropertyMappingAttribute(att);                 // A product has a category, a category has products                 _metadata.AddResourceReferenceProperty(product, "Category", categories);                 _metadata.AddResourceSetReferenceProperty(category, "Products", products);             }             return _metadata;         }         /// <summary>         /// Based on the requesting user, can set up permissions to Read, Write, etc.         /// </summary>         /// <param name="config"></param>         public static void InitializeService(DataServiceConfiguration config)         {             config.SetEntitySetAccessRule("*", EntitySetRights.All);             config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;             config.DataServiceBehavior.AcceptProjectionRequests = true;         }     } }     The objects prefixed with DSP come from the samples on the oData site: http://www.odata.org/developers The products and categories objects are POCO business objects with no special modifiers. Three main options are available for defining the MetaData of data sources in .Net: 1) Generate Entity Data model (Potentially directly from SQL Server database).  This requires the least amount of manual interaction, and uses the edmx WYSIWYG editor to generate a data model. 
 This can be directly tied to the SQL Server database and generated from the database if you want a data access layer tightly coupled with your database. 2) Object model decorations.  If you already have a POCO data layer, you can decorate your objects with properties to statically inform the compiler how the objects are related.  The disadvantage is there are now tags strewn about your business layer that need to be updated as the business rules change.  3) Programmatically construct metadata object.  This is the object illustrated above in CreateDSPMetaData.  This puts all relationship information into one central programmatic location.  Here business rules are constructed when the DSPMetaData response object is returned.   Once you have your service up and running, RestSharp is designed for XML / Json, along with the native Microsoft library.  There are currently some differences between how Jason made RestSharp expect XML with how atom+pub works, so I found better results currently with the Json implementation - modifying the RestSharp XML parser to make an atom+pub parser is fairly trivial though, so use what implementation works best for you. I put together a sample console app which calls the RestSvcImpl.svc service defined above (and assumes it to be running on port 2000).  I used both RestSharp as a client, and also the default Microsoft oData client tools. namespace RestConsole {     class Program     {         private static DataServiceContext _ctx;         private enum DemoType         {             Xml,             Json         }         static void Main(string[] args)         {             // Microsoft implementation             _ctx = new DataServiceContext(new System.Uri("http://localhost:2000/RestServiceImpl.svc"));             var msProducts = RunQuery<Product>("Products").ToList();             var msCategory = RunQuery<Category>("/Products(0)/Category").AsEnumerable().Single();             var msFilteredProducts = RunQuery<Product>("/Products?$filter=length(Name) ge 4").ToList();             // RestSharp implementation                          DemoType demoType = DemoType.Json;             var client = new RestClient("http://localhost:2000/RestServiceImpl.svc");             client.ClearHandlers(); // Remove all available handlers             // Set up handler depending on what situation dictates             if (demoType == DemoType.Json)                 client.AddHandler("application/json", new RestSharp.Deserializers.JsonDeserializer());             else if (demoType == DemoType.Xml)             {                 client.AddHandler("application/atom+xml", new RestSharp.Deserializers.XmlDeserializer());             }                          var request = new RestRequest();             if (demoType == DemoType.Json)                 request.RootElement = "d"; // service root element for json             else if (demoType == DemoType.Xml)             {                 request.XmlNamespace = "http://www.w3.org/2005/Atom";             }                              // Return all products             request.Resource = "/Products?$orderby=Name";             RestResponse<List<Product>> productsResp = client.Execute<List<Product>>(request);             List<Product> products = productsResp.Data;             // Find category for product with ProductID = 1             request.Resource = string.Format("/Products(1)/Category");             RestResponse<Category> categoryResp = client.Execute<Category>(request);             Category category = categoryResp.Data;             // 
Specialized queries             request.Resource = string.Format("/Products?$filter=ProductID eq {0}", 1);             RestResponse<Product> productResp = client.Execute<Product>(request);             Product product = productResp.Data;                          request.Resource = string.Format("/Products?$filter=Name eq '{0}'", "XYZ Water");             productResp = client.Execute<Product>(request);             product = productResp.Data;         }         private static IEnumerable<TElement> RunQuery<TElement>(string queryUri)         {             try             {                 return _ctx.Execute<TElement>(new Uri(queryUri, UriKind.Relative));             }             catch (Exception ex)             {                 throw ex;             }         }              } }   Feel free to step through the code a few times and to attach a debugger to the service as well to see how and where the context and metadata objects are constructed and returned.  Pay special attention to the response object being returned by the oData service - There are several properties of the RestRequest that can be used to help troubleshoot when the structure of the response is not exactly what would be expected.

    Read the article

  • Twitter API Voting System

    - by Richard Jones
    So I blatantly got this idea from the MIX 10 event. At MIX they held a rock band talent competition type thing (I'm not quite sure of all the details). But the interesting part for me is how they collected votes. They used Twitter (what else, when you have a few thousand geeks available to you). The basic idea was that you tweeted your vote with a # tag, i.e. #ROCKBANDVOTE vote Richard. How cool... So the question is: how do you write something to collate and count all the votes? Time to press the magic Visual Studio New Project button... Twitter has a really nice API that can be invoked from .NET. This is the snippet of code that will search for any given phrase, i.e. #ROCKBANDVOTE:

        public static XmlDocument GetSearchResults(string searchfor)
        {
            return GetSearchResults(searchfor, "");
        }

        public static XmlDocument GetSearchResults(string searchfor, string sinceid)
        {
            XmlDocument retdoc = new XmlDocument();
            try
            {
                string url = "http://search.twitter.com/search.atom?&q=" + searchfor;
                if (sinceid.Length > 0)
                    url += "&since_id=" + sinceid;
                HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
                request.Method = "POST";
                request.ContentType = "application/x-www-form-urlencoded";
                WebResponse res = request.GetResponse();
                retdoc.Load(res.GetResponseStream());
                res.Close();
            }
            catch { }
            return retdoc;
        }

    I've got two overloads that optionally let you pass in the last ID to look for, as well as what you want to search for. Note that Twitter rate limits the number of requests you can send; see http://apiwiki.twitter.com/Rate-limiting. So realistically I wanted my app to run every hour or so and only pull out results that haven't been received before (hence the overload to pass in the sinceid parameter). I'll post the code that parses the returned XML when it's finished.
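    The parsing code promised at the end of the post is not part of this excerpt, but purely as an illustrative sketch (my own, not the author's), here is one way the Atom feed returned by GetSearchResults could be tallied into vote counts; the "vote <name>" tweet format and the CountVotes name are assumptions.

        using System;
        using System.Collections.Generic;
        using System.Text.RegularExpressions;
        using System.Xml;

        class VoteCounter
        {
            // Tally "#ROCKBANDVOTE vote <name>" tweets from a search.atom result document.
            public static Dictionary<string, int> CountVotes(XmlDocument searchResults)
            {
                var votes = new Dictionary<string, int>(StringComparer.OrdinalIgnoreCase);
                var ns = new XmlNamespaceManager(searchResults.NameTable);
                ns.AddNamespace("atom", "http://www.w3.org/2005/Atom");

                // Each tweet comes back as an <entry>; the tweet text is in its <title>.
                foreach (XmlNode title in searchResults.SelectNodes("//atom:entry/atom:title", ns))
                {
                    Match m = Regex.Match(title.InnerText, @"vote\s+(\w+)", RegexOptions.IgnoreCase);
                    if (!m.Success) continue;

                    string candidate = m.Groups[1].Value;
                    int count;
                    votes.TryGetValue(candidate, out count);
                    votes[candidate] = count + 1;
                }
                return votes;
            }
        }

    A dictionary like this could then be merged across the hourly runs described above, with the since_id overload ensuring each tweet is only counted once.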

    Read the article

  • New Fusion CRM Webinars for Partners dates and subjects announced

    - by Richard Lefebvre
    New Fusion CRM weekly webinar dates and subjects have been announced! Visit our microsite to find out about the sessions to come and mark them in your agenda. The next session will take place on Monday, April 2nd at 3pm GMT / 4pm CET and will address Fusion CRM Sales Planning. In order to check the complete agenda and see the login details, please visit our dedicated microsite.
    How to join the dedicated microsite: click on http://isdportal.oracle.com/isd_html/sf.htm, enter your email address in the corresponding field, and enter fusion_crm in the “Access URL/Page Token” field.
    Agenda: the list of sessions is published and will be regularly updated in the microsite.
    Duration: each session lasts up to 60 minutes.
    Webex: the respective webinar link and session ID are published in the microsite.
    Audio: the audio call details (telephone numbers by country, call number and password) are indicated in the microsite.
    Slides: for your convenience, a PDF copy of each presentation will be stored in the microsite's document section.
    We hope that this series of webcasts will be instrumental to your Fusion CRM business success! For further information please contact me at richard[email protected]

    Read the article

  • Big Success for the 2012 edition of the Oracle EMEA Cloud CRM Partner Community Forum!

    - by Richard Lefebvre
    The 2012 edition of the Oracle EMEA Cloud CRM Partner Community Forum took place on March 28 & 29 in Madrid. 100 participants from all over Europe had a chance to interact with the Oracle Cloud CRM Product Management team on multiple subjects, such as the Oracle Cloud CRM and Social Network solutions strategy, the RightNow acquisition, Fusion CRM business opportunities for partners, etc. During his opening keynote, Anthony Lye (Oracle Senior Vice President and head of Oracle CRM) presented the current Fusion CRM business status, disclosed the overall Oracle CRM product strategy and responded to many questions from the audience. Later that day, 8 Oracle ISVs presented their Oracle Cloud CRM add-ons and highlighted the value that System Integrators can draw from them as part of a Cloud CRM project. After a very friendly networking dinner in a Spanish restaurant, the second day was dedicated to Fusion CRM, with a deeper dive into its major components (Sales, Sales Planning, Marketing), including the Fusion Composers. Briefings on Oracle Consulting Services' dedicated Fusion CRM offerings and Fusion CRM partner programs concluded the day and the event. All participants rated the event as "good" to "excellent" and mentioned that it was meaningful for planning their Oracle Cloud CRM based business in the near future. We look forward to organising a similar event next year and to welcoming even more partners! Richard Lefebvre

    Read the article

  • Sweating over an ROI model for Social?

    - by Richard Lefebvre
    In this article, Richard Beattie (Oracle EMEA SRM Director) argues that ROI is not the only measure for organizations considering their Social strategy.

    Read the article

  • Group Policy fault - Students force

    - by Richard 'Bean' Williams
    I work at a school and we've got a scenario. We block F8 on all computers so students cannot access Safe Mode to bypass Group Policy. But students are logging into their accounts using AD and turning the computers off halfway through, and then claiming that when they log in next time they have local Administrator accounts. Is this possible? We have blocked F8 and Startup Repair, so we're wondering how they actually did it. Cheers, Richard

    Read the article

< Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >