Search Results

Search found 19191 results on 768 pages for 'core location'.


  • The path to MCSE:SharePoint. The Overview.

    - by Enrique Lima
    There have been some changes to certifications recently, and with them new challenges and requirements. In the past we had MCTS and MCITP or MCPD on a specific product and that was it; now the story is somewhat different. You will need to know not only the product (yes, I am one that still focuses on knowing a product, not the test) but also the environment on which it sits or lives (therefore Windows Server and supporting services). The requirements for MCSE: SharePoint now take you through the MCSA: Windows Server 2012. Many have questioned this; I don't. Why? I have seen plenty of "accidental SharePoint Farm Administrators" that have no background with the Server OS, much less with the services (like DNS and IIS). Again, I am not saying this will guarantee knowledge, but it does in some way require exposure to it. So, again, the next several posts will provide guidance on the knowledge needed for the test requirements. If you have seen the way I go about this, you know I don't focus on exam questions, but rather on pointing to the TechNet and MSDN documentation to get to know the product. I will also, in this case, go through the process of setting up a virtual environment to play with the products and get to know them. Now, the requirements themselves are:

    MCSA: Windows Server 2012
    - Exam 70-410: Installing and Configuring Windows Server 2012
    - Exam 70-411: Administering Windows Server 2012
    - Exam 70-412: Configuring Advanced Windows Server 2012 Services

    SharePoint-specific exams
    - Exam 70-331: Core Solutions of Microsoft SharePoint Server 2013
    - Exam 70-332: Advanced Solutions of Microsoft SharePoint Server 2013

    Passing the 5 exams will grant you the MCSE: SharePoint credential.

    Read the article

  • We'll be at QCon San Francisco!

    - by Carlos Chang
    Oracle Technology Network is a Platinum sponsor at QCon San Francisco. Don't miss these great developer-focused sessions:

    Shay Shmeltzer - How we simplified Web, Mobile and Cloud development for our own developers - the Oracle Story
    Over the past several years, Oracle has been developing a new set of enterprise applications in what is probably one of the largest Java-based development projects in the world. How do you take 3,000 developers and make them productive? How do you ensure the delivery of cutting-edge UIs for both Mobile and Web channels? How do you enable Cloud-based development and deployment? Come and learn how we did it at Oracle, and see how the same technologies and methodologies can apply to your development efforts.

    Dan Smith - Project Lambda in Java 8
    Java SE 8 will include major enhancements to the Java programming language and its core libraries. This suite of new features, known as Project Lambda in the OpenJDK community, includes lambda expressions, default methods, and parallel collections (and much more!). The result will be a next-generation Java programming experience with more flexibility and better abstractions. This talk will introduce the new Java features and offer a behind-the-scenes view of how they evolved and why they work the way that they do.

    Arun Gupta - JSR 356: Building HTML5 WebSocket Applications in Java
    The family of HTML5 technologies has pushed the pendulum away from rich client technologies and toward ever-more-capable Web clients running on today's browsers. In particular, WebSocket brings new opportunities for efficient peer-to-peer communication, providing the basis for a new generation of interactive and "live" Web applications. This session examines the efforts under way to support WebSocket in the Java programming model, from its base-level integration in the Java Servlet and Java EE containers to a new, easy-to-use API and toolset that are destined to become part of the standard Java platform.

    The complete conference schedule is here: http://qconsf.com/sf2012/schedule/wednesday.jsp

    But wait, there's more! At the Oracle booth, we'll also be covering:
    - Oracle ADF Mobile
    - Oracle Developer Cloud Service
    - Oracle ADF Essentials
    - NetBeans Project Easel

    Hope to see you there!
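
    For a flavor of what Dan Smith's session covers, here is a minimal sketch of the Project Lambda features named above (our own illustration, not material from the session): a lambda expression implementing a functional interface, a default method, and a parallel stream over a collection.

        import java.util.Arrays;
        import java.util.List;

        public class LambdaDemo {
            // A functional interface: one abstract method. The default method adds
            // shared behavior without forcing an abstract base class.
            interface Greeter {
                String name();
                default String greet() { return "Hello, " + name(); }
            }

            public static void main(String[] args) {
                Greeter duke = () -> "Duke";       // lambda expression
                System.out.println(duke.greet());  // prints: Hello, Duke

                // Parallel operation over a collection via the new streams API.
                List<String> names = Arrays.asList("Ada", "Grace", "Alan");
                long longNames = names.parallelStream()
                                      .filter(n -> n.length() > 3)
                                      .count();
                System.out.println(longNames);     // prints: 2
            }
        }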

    Read the article

  • Removing 301 redirect from site root

    - by Jon Clements
    I'm having a look at a friend's website (a fairly old PHP-based one) which they've been advised needs re-structuring. The key points being:

    - URLs should be lower case and more "friendly".
    - The root of the domain should not be re-directed.

    The first point I'm happy with (and the URLs needed tidying up anyway) and I have a draft plan of action; however, the second is baffling me as to not only the best way to do it, but also whether it should be done. Currently http://www.example.com/ is redirected to http://www.example.com/some-link-with-keywords/ using the following index.php in the root of the Apache2 instance:

        <?php
        $nextpage = "some-link-with-keywords/";
        header( "HTTP/1.1 301 Moved Permanently" );
        header( "Status: 301 Moved Permanently" );
        header( "Location: $nextpage" );
        exit(0); // This is optional but suggested, to avoid any accidental output
        ?>

    As far as I'm aware, this has been the case for around three years -- and I'm sorely tempted to advise not to worry about it. It would appear taking off the 301 could:

    - Potentially affect page ranking (as the 'homepage' would disappear - although it couldn't disappear because of the next point...)
    - Introduce maintenance issues, as existing users would still have the re-directed page in their cache
    - Following the above, introduce duplicate content
    - Confuse Google/other SEs as to what the homepage actually is now

    I may be over-analysing this, but I have a feeling it's not as simple as removing the 301 from the root and 301'ing the previous target to the root... Any suggestions (including "it's not worth it") are sincerely appreciated.

    Read the article

  • How to manage a large number of users on the server side?

    - by Rami
    I built a social Android application in which users can see other users around them by GPS location. At the beginning things went well, as I had a low number of users, but now that I have an increasing number of users (about 1,500, +100 every day) a major problem in my design has surfaced. In my Google App Engine servlet I have a static HashMap holding all the user profile objects, currently 1,500, and this number will increase as more users register.

    Why I'm doing it: every user requesting the users around him compares his GPS position against the others' and checks whether they are within his 10 km radius; this happens every 5 minutes on average. That is why I can't fetch the users from the DB every time - the GAE read/write operation quota would tear me apart.

    The problem with this design is that as the number of users increases, the HashMap turns to null every 4-6 hours. I think this interval is getting shorter, but I'm not sure. I'm fixing this by reloading the users from the DB every time I detect that it became null, but this causes a 30-second denial of service for my users, so I'm looking for a better solution. I'm guessing that it happens because of the size of the HashMap - am I right?

    I have been advised to use a spatial database, but that means I can't work with GAE any more, which means I'd need to build my big server all over again and lose my existing DB. Is there something I can do with the existing tools? Thanks.
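
    One idea worth sketching before giving up on GAE (a rough, untested illustration; all names here are hypothetical): instead of one flat HashMap that every query scans in full, bucket the profiles into coarse latitude/longitude grid cells, so each 5-minute "who is near me" query only touches the caller's cell and its eight neighbours.

        import java.util.ArrayList;
        import java.util.Collections;
        import java.util.List;
        import java.util.Map;
        import java.util.concurrent.ConcurrentHashMap;

        public class UserGrid {
            // ~0.1 degrees of latitude is roughly 11 km, close to the 10 km radius.
            private static final double CELL = 0.1;
            private final Map<String, List<UserProfile>> cells = new ConcurrentHashMap<>();

            public static class UserProfile {
                public final String id;
                public final double lat, lng;
                public UserProfile(String id, double lat, double lng) {
                    this.id = id; this.lat = lat; this.lng = lng;
                }
            }

            private static String cellKey(double lat, double lng) {
                return (int) Math.floor(lat / CELL) + ":" + (int) Math.floor(lng / CELL);
            }

            public void add(UserProfile u) {
                cells.computeIfAbsent(cellKey(u.lat, u.lng), k -> new ArrayList<>()).add(u);
            }

            // Scan only the 3x3 block of cells around the caller; the caller then
            // filters the few candidates by exact distance.
            public List<UserProfile> candidatesNear(double lat, double lng) {
                int cx = (int) Math.floor(lat / CELL);
                int cy = (int) Math.floor(lng / CELL);
                List<UserProfile> result = new ArrayList<>();
                for (int dx = -1; dx <= 1; dx++) {
                    for (int dy = -1; dy <= 1; dy++) {
                        result.addAll(cells.getOrDefault((cx + dx) + ":" + (cy + dy),
                                                         Collections.emptyList()));
                    }
                }
                return result;
            }
        }

    Note this does not by itself explain the map "turning to null": a static field vanishing is most likely App Engine recycling the instance, so whatever structure is used would ideally be mirrored in memcache or the datastore rather than held only in instance memory.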

    Read the article

  • JavaOne 2012 Conference Preview

    - by Janice J. Heiss
    A new article, by noted freelancer Steve Meloan, now up on otn/java, titled “JavaOne 2012 Conference Preview,” looks ahead to the fast-approaching JavaOne 2012 Conference, scheduled for September 30-October 4 in San Francisco. The Conference will celebrate and highlight one of the world’s leading technologies. As Meloan states, “With 9 million Java developers worldwide, 5 billion Java cards in use, 3 billion mobile phones running Java, 1 billion Java downloads each year, and 100 percent of Blu-ray disk players and 97 percent of enterprise desktops running Java, Java is a technology that literally permeates our world.”

    The 2012 JavaOne is organized under seven technical tracks:
    * Core Java Platform
    * Development Tools and Techniques
    * Emerging Languages on the JVM
    * Enterprise Service Architectures and the Cloud
    * Java EE Web Profile and Platform Technologies
    * Java ME, Java Card, Embedded, and Devices
    * JavaFX and Rich User Experiences

    Conference keynotes will lay out the Java roadmap. For the Sunday keynote, such Oracle luminaries as Cameron Purdy, Vice President of Development; Nandini Ramani, Vice President of Engineering, Java Client and Mobile Platforms; Richard Bair, Chief Architect, Client Java Platform; and Mark Reinhold, Chief Architect, Java Platform, will be presenting. For the Thursday IBM keynote, Jason McGee, Distinguished Engineer and Chief Architect for IBM PureApplication System, and John Duimovich, Java CTO and IBM Distinguished Engineer, will explore Java and IBM's cloud-based initiatives.

    All in all, the JavaOne 2012 Conference should be as exciting as ever. Link to the article here. Originally published on blogs.oracle.com/javaone.

    Read the article

  • Laggy Graphics After Upgrade to Ubuntu 13.10

    - by John bracciano
    I noticed some issues concerning the graphics after I upgraded from Ubuntu 13.04 to 13.10. More specifically:

    - Dash tends to be more "laggy" than it used to be.
    - When I switch desktops you can clearly see the screen going frame by frame.
    - The menu bar is also "laggy". When I open GIMP, the menu bar is useless after a while - it takes seconds to refresh.

    The problems seem to appear after some seconds of use. When I boot the system everything works fine, until some more graphics-intensive program starts (Firefox, GIMP, etc).

    /usr/lib/nux/unity_support_test -p reports:

        OpenGL vendor string: Intel Open Source Technology Center
        OpenGL renderer string: Mesa DRI Intel(R) Ivybridge Mobile
        OpenGL version string: 3.0 Mesa 9.2.1
        Not software rendered: yes
        Not blacklisted: yes
        GLX fbconfig: yes
        GLX texture from pixmap: yes
        GL npot or rect textures: yes
        GL vertex program: yes
        GL fragment program: yes
        GL vertex buffer object: yes
        GL framebuffer object: yes
        GL version is 1.4+: yes
        Unity 3D supported: yes

    I use an Asus Zenbook UX31A with:
    - Intel® Core i5 Ivy Bridge
    - Integrated Intel® HD Graphics 4000
    - Intel® HM76 Express Chipset

    Thank you in advance for any answer!

    Read the article

  • Personalized Pricing

    - by David Dorf
    In past postings I've spent a fair amount of time talking about targeted promotions. Using a complete view of the customer that includes purchase history, location history, and psychographics gleaned from social media, we can select the offer with the greatest chance of redemption. This is done to influence shopping behavior, which might be introducing the consumer to a new product line, increasing their basket size, increasing frequency of purchases, etc.

    Safeway seems to be taking a slightly different approach with their personalized pricing. In addition to offering electronic coupons and club card offers, they are also providing a personalized price for certain items based on purchase history. So when Sally wants to shop at Safeway, she first checks the "Just for U" website for three types of deals. She starts by selecting manufacturer coupons to load into her loyalty card, then she checks the Club Card for offers like "buy one get one free." The third step is the interesting one: Safeway will set a particular lower price for Sally, good for 90 days, on items she buys often. Clearly this isn't enforcing a new behavior but rather instilling loyalty. I would love to know exactly how they are determining the personalized price. Of course bargain hunters can still stack the three offers so they can, for example, get their $4.99 Oatmeal for $0.72.

    I like this particular question and answer from their website's FAQ: "My offers are not that great. Can I tell you what offers I need?" "That's a good idea. That functionality is not currently available, but we appreciate your input and are constantly improving our just for U program. Stay tuned for exciting enhancements!"

    I suppose if Safeway is tracking all the purchases, they can easily determine whether the customer is profitable. As long as the customer stays profitable, why not let them determine a few offers themselves? Food for thought.

    Read the article

  • Today @ OOW: Identity Management for the SoMoClo world

    - by B Shashikumar
    Today at OpenWorld, we have a very interesting lineup of Identity Management sessions that discuss how to extend identity management securely to cloud, mobile and social ecosystems. Here are 3 of the can't-miss identity management sessions today:

    Identity Management and the Cloud: Security is regularly identified as the #1 barrier to cloud service adoption. Oracle Identity Management is designed to help customers extend and connect core identity services to SaaS applications and systems. This session explores how organizations are using Oracle Identity Management with cloud services and how some customers are offering identity management as a cloud service.

    Real-time External Authorization for Applications, Middleware and Databases: Externalization of authorization is key to manageability and audit. This session covers enterprise-wide authorization solution deployment best practices and real-world examples of using Oracle Entitlements Server - the one-stop standards-compliant authorization solution - for middleware, applications, and data.

    Delivering Secure WiFi on the Tube as an Olympics Legacy from London 2012: In this session, Virgin Media, the U.K.'s first combined provider of broadband, TV, mobile, and home phone services, shares how it is providing free secure Wi-Fi services to the London Underground, using Oracle Virtual Directory and Oracle Entitlements Server, leveraging back-end legacy systems that were never designed to be externalized. As an Olympics 2012 legacy, the Oracle architecture will form a platform to be consumed by other Virgin Media services such as video on demand.

    Here is the complete lineup of Identity Management sessions today at OOW.

    Read the article

  • Ubuntu Desktop does not load

    - by Niklas
    If I log in on my Ubuntu 14.04, I get the following desktop. This weird behavior appeared after I executed sudo apt-get update && sudo apt-get upgrade and restarted my computer. I don't know why, though. I have tried the following on my Ubuntu (nothing seems to work so far):

    Fix any broken packages:
        sudo apt-get update
        sudo apt-get autoclean
        sudo apt-get clean
        sudo apt-get autoremove

    Locate any broken packages and reinstall them:
        sudo apt-get install debsums
        sudo apt-get clean
        sudo debsums_init
        sudo debsums -cs
        sudo apt-get install --reinstall $(sudo dpkg -S $(sudo debsums -c) | cut -d : -f 1 | sort -u)

    Removing some compiz files:
        rm -r ~/.cache/compizconfig-1
        rm -r ~/.compiz

    Purging NVIDIA and installing nvidia-prime:
        sudo apt-get install --reinstall ubuntu-desktop
        sudo apt-get install unity
        sudo apt-get purge nvidia* bumblebee*
        sudo apt-get install nvidia-prime
        sudo shutdown -r now

    CompizConfig Settings Manager (back in the UI, enabling the Unity plugin):
        sudo apt-get install compizconfig-settings-manager
        export DISPLAY=:0
        ccsm

    Unity replace, which stalled after a while and did nothing afterwards:
        unity --replace

    Some dconf resets:
        dconf reset -f /org/compiz/
        unity --reset-icons &disown

    Actually, dconf did not work and I got this error:
        error: Cannot autolaunch D-Bus without X11 $DISPLAY

    Can anybody help me with that? This is my hardware (hope it helps in any way):
    - Intel® Core™ i7-3770
    - ASUS GTX660TI-DC2-OG-2GD5 (NVIDIA driver is/was installed)
    - ASUS P8Z77-V LX
    - Corsair DIMM 8 GB DDR3-1600 Kit
    - Samsung 830 series 2.5" 256 GB (Windows is installed here)
    - Seagate ST31000524AS 1 TB (3/4 is reserved for files; 1/4 is for Ubuntu, 16 GB swap included)

    Read the article

  • Why doesn't my IDE do background compiling/building?

    - by MKO
    Today I develop on a fairly complex computer: it has multiple cores, SSD drives and what not. Still, most of the time I'm programming, the computer is leisurely doing nothing. When I need to compile and run/deploy a somewhat complex project, at best it still takes a couple of seconds. Why? Now that we're living more and more in the "age of instant", why can't I press F5 in Visual Studio and launch/deploy my application instantly? A couple of seconds might not sound so bad, but it's still cognitive friction and time that adds up, and frankly it makes programming less fun.

    So how could compilation be instant? Well, people tend to edit files in different assemblies; what if Visual Studio/the IDE constantly did compilation and building of everything that I modified anytime that it might be appropriate? Heck, if they wanted to go really advanced they could do per-class compilation. The compilation might not work, but then it could just silently do nothing (except adding error messages to the error window). Surely today's computers could dedicate a core or two to this task, and if someone found it annoying it could be disabled by option.

    I know there are probably a thousand technical issues and some fancy shadow copying that would need to be resolved for this to be seamless and practical, but it sure would make programming more seamless. Is there any practical reason why this scenario isn't possible? Would the wear and tear of continually writing binaries be too much? Couldn't assemblies be held in memory until deployed/run?

    Read the article

  • cpufreq-selector, cpufreq-info reporting wrong max speed

    - by dty
    Hi, I've just built a new machine with a Core i5-760 CPU. Max speed is 2.80GHz (turbo mode notwithstanding). I've also done a vanilla install of Ubuntu 10.10. I've added the cpufreq-selector applet to the top panel, and its menus only allow me to select up to 2.39GHz. If I select the "performance" governor, it also shows 2.39GHz. cpufreq-info reports:

        $ cpufreq-info
        cpufrequtils 007: cpufreq-info (C) Dominik Brodowski 2004-2009
        Report errors and bugs to [email protected], please.
        analyzing CPU 0:
          driver: acpi-cpufreq
          CPUs which run at the same hardware frequency: 0 1 2 3
          CPUs which need to have their frequency coordinated by software: 0
          maximum transition latency: 10.0 us.
          hardware limits: 1.20 GHz - 2.39 GHz
          available frequency steps: 2.39 GHz, 2.26 GHz, 2.13 GHz, 2.00 GHz, 1.86 GHz, 1.73 GHz, 1.60 GHz, 1.46 GHz, 1.33 GHz, 1.20 GHz
          available cpufreq governors: conservative, ondemand, userspace, powersave, performance
          current policy: frequency should be within 1.20 GHz and 2.39 GHz.
                          The governor "ondemand" may decide which speed to use within this range.
          current CPU frequency is 1.20 GHz.
          cpufreq stats: 2.39 GHz:13.74%, 2.26 GHz:0.09%, 2.13 GHz:0.08%, 2.00 GHz:0.08%, 1.86 GHz:0.07%, 1.73 GHz:0.07%, 1.60 GHz:0.08%, 1.46 GHz:0.08%, 1.33 GHz:0.11%, 1.20 GHz:85.61% (15560)
        [...CPUs 1-3 elided, but similar...]

    Any idea how to get Ubuntu and the various tools to recognise this as a 2.80GHz processor? And ideally to report the actual speed when running in turbo mode too, but that's not critical.

    Edit: I should probably add that the BIOS (and Windows) are quite happy that it's a 2.80GHz CPU. Thanks.

    Read the article

  • O'Reilly deals to April 5, 2012 14:00 PT on books on "where"

    - by TATWORTH
    At http://shop.oreilly.com/category/deals/where-conference.do, O'Reilly are offering a series of books on geo-location at 50% off until April 5, 2012 14:00 PT.

    HTML5 Geolocation
    Truly revolutionary: now you can write geolocation applications directly in the browser, rather than develop native apps for particular devices. This concise book demonstrates the W3C Geolocation API in action, with code and examples to help you build HTML5 apps using the "write once, deploy everywhere" model. Along the way, you get a crash course in geolocation, browser support, and ways to integrate the API with common geo tools like Google Maps.

    HTML5 Cookbook
    With scores of practical recipes you can use in your projects right away, this cookbook helps you gain hands-on experience with HTML5's versatile collection of elements. You get clear solutions for handling issues with everything from markup semantics, web forms, and audio and video elements to related technologies such as geolocation and rich JavaScript APIs. Each informative recipe includes sample code and a detailed discussion on why and how the solution works. Perfect for intermediate to advanced web and mobile web developers, this handy book lets you choose the HTML5 features that work for you - and helps you experiment with the rest.

    HTML5 Applications
    HTML5 is not just a replacement for plugins. It also makes the Web a first-class development environment by giving JavaScript programmers a solid foundation for building industrial-strength applications. This practical guide takes you beyond simple site creation and shows you how to build self-contained HTML5 applications that can run on mobile devices and compete with desktop apps. You'll learn powerful JavaScript tools for exploiting HTML5 elements, and discover new methods for working with data, such as offline storage and multi-threaded processing. Complete with code samples, this book is ideal for experienced JavaScript and mobile developers alike.

    There are also other books being offered at a discount at http://shop.oreilly.com/category/deals/where-conference.do.

    Read the article

  • Reversing a Google Analytics transaction

    - by Dan
    I am confused about what code must be executed to reverse a Google Analytics transaction. I have the following code pasted within a test page:

        <body onLoad="function()">
        <script type="text/javascript">
          var _gaq = _gaq || [];
          _gaq.push(['_setAccount', 'UA-25305776-3']);
          _gaq.push(['_trackPageview']);
          _gaq.push(['_addTrans',
            '11455',   // order ID - required
            '-42.38',  // total - required
            '-2.38',   // tax
            '-15.00'   // shipping
          ]);
          _gaq.push(['_addItem',
            '11455',   // order ID - necessary to associate item with transaction
            'Evan Turner Turningpoint™ Basketball Pants', // product name
            '25.00',   // unit price - required
            '-1'       // quantity - required
          ]);
          _gaq.push(['_trackTrans']);
          (function() {
            var ga = document.createElement('script');
            ga.type = 'text/javascript';
            ga.async = true;
            ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
            var s = document.getElementsByTagName('script')[0];
            s.parentNode.insertBefore(ga, s);
          })();
        </script>

    Is this correct? Thanks!

    Read the article

  • Oracle Business Intelligence Applications 10g Bootcamp

    - by mseika
    Oracle Business Intelligence Applications 10g Bootcamp, 12th - 15th February 2012, Reading (UK)

    The Oracle Business Intelligence Applications offer out-of-the-box integration with Siebel CRM and Oracle eBusiness Suite and provide pre-built Operational BI solutions for eBusiness Suite, PeopleSoft, Siebel, and SAP. This training will provide attendees with an in-depth working understanding of the architecture and the technical and functional content of the Oracle Business Intelligence Applications, whilst also providing an understanding of their installation, configuration and extension.

    The course will cover the following topics:
    • Overview of Oracle Business Intelligence Applications
    • Oracle BI Applications Fundamentals and Features
    • Configuring BI Applications for Oracle E-Business Suite
    • Understanding BI Applications Architecture
    • Fundamentals of BI Applications Security

    REGISTER NOW (see the Partner Registration Guide). Price: FREE

    Venue: Cookham Room, Oracle Corporation UK Ltd, Oracle Parkway, Thames Valley Park, Reading, Berkshire RG6 1RA. 12th - 15th February 2012, 9:30 am – 5:00 pm BST.

    Audience: The seminar is aimed at BI Consultants and Implementation Consultants within Oracle's Gold and Platinum Partners.

    Prerequisites:
    • Good understanding of basic data warehousing concepts
    • Hands-on experience in Oracle Business Intelligence Enterprise Edition
    • Hands-on experience in Informatica
    • Some understanding of Oracle BI Applications is required (see Sales & Technical Tutorials for OBI, BI-Apps and Hyperion EPM)
    • Good understanding of any of the following Oracle EBS modules: General Ledger, Accounts Receivable, Accounts Payable

    System Requirements: Please note that attendees are required to have a laptop.
    Laptop:
    • 4GB RAM, recognized by 64-bit Windows
    • 80GB free space on the hard drive or an external device
    • CPU Core 2 Duo or higher
    Operating System Requirements:
    • Windows 7, Windows XP, Windows 2003
    • NOT ALLOWED with Windows Vista
    • An Administrator user

    For more information please contact [email protected].

    Read the article

  • Why is the Nautilus quicklist not working?

    - by jasmines
    None of my bookmarks (Documents, Pictures, Downloads, Dropbox, Ubuntu One, Music, Public) will open if I right-click on the Home icon and select them, even though they are correctly shown. The only entries that work are Home and Open A New Window. I've read similar questions (http://askubuntu.com/questions/184504/unity-home-quicklist-not-working-when-nautilus-is-closed and "unity home quicklist not working"), but my problem seems different... Anyway, I can't solve it with the suggested workarounds.

        $ ls ~
        Audiobooks  Dropbox      Modelli              Pubblici    Video
        Backup      dvdrip-data  Musica               Scaricati   VirtualBox VMs
        deja-dup    grive        Pictures - GT-I9100  Scrivania   virtual-drives
        Documenti   Immagini     Podcasts             Ubuntu One  Vuze
        Downloads

        $ cat /usr/share/applications/nautilus.desktop
        [Desktop Entry]
        Name=Files
        Comment=Access and organize files
        Exec=nautilus %U
        Icon=system-file-manager
        Terminal=false
        Type=Application
        StartupNotify=true
        OnlyShowIn=GNOME;Unity;
        Categories=GNOME;GTK;Utility;Core;
        MimeType=inode/directory;application/x-gnome-saved-search;
        X-GNOME-Bugzilla-Bugzilla=GNOME
        X-GNOME-Bugzilla-Product=nautilus
        X-GNOME-Bugzilla-Component=general
        X-GNOME-Bugzilla-Version=3.4.2
        Actions=Window;
        X-Ubuntu-Gettext-Domain=nautilus

        [Desktop Action Window]
        Name=Open a New Window
        Exec=nautilus
        OnlyShowIn=Unity;

    Read the article

  • Suggestions required to build an ECommerce Platform

    - by Haris
    For a prospective client we have to offer a solution providing the following systems:

    - CMS
    - Order Management
    - Shopping Cart
    - CRM
    - Helpdesk
    - Accounting & Finance
    - Custom Functions

    In order to save time and avoid reinventing the wheel, our idea is to integrate different off-the-shelf solutions. Their first requirement is that the system has to be hosted in their country, which I think will exclude applications like Aplicor, NetSuite & Salesforce. Basically the nucleus would be the CMS, which would integrate all the other apps. PHP- or .NET-based solutions would be our preference, as we have in-house expertise. So far, the following are a few combinations I have come up with:

    1. Joomla (CMS) + VirtueMart (Cart + Ordering) + SugarCRM + OpenERP (Finance) + OTRS
    2. Magento (CMS + Cart + Ordering) + SugarCRM + OpenERP (Finance) + Helpdesk Ultimate
    3. Drupal (CMS) + Ubercart (Cart + Ordering) + SugarCRM + OpenERP (Finance) + Support Ticketing System
    4. SharePoint (CMS) + OptimusBT (Cart + Ordering) + Dynamics CRM + Great Plains + SharePointHQ
    5. DotNetNuke (CMS) + DNNSpot (Cart + Ordering) + Sigma Pro (CRM + Helpdesk) + OpenERP

    For the helpdesk I liked Zendesk, but the server location was the stopping factor; similarly for finance and CRM I liked Aplicor. I would not like to go into detailed requirements, as it would make things very complex. Could you please suggest which options are worth looking into? What other options do we have?

    Read the article

  • Oracle Financials In the News

    - by Di Seghposs
    Coming off of OpenWorld and all the excitement around Oracle's "Cloud" strategy, we thought we'd share what others had to say recently about Oracle's financial solutions in and out of the cloud.

    Information Management, the educated reader's choice for the latest news, commentary and feature content serving the information technology and business community, had an interesting blog post from Bill McNee of Saugatuck Technology, entitled "A Bull Market for Finance Cloud Apps". In the post, he highlights Oracle as one of the 'significant players' in the space:

    "Oracle: As recently announced, Oracle is now aggressively marketing its Oracle Fusion Financials Cloud Service to midsize and large enterprise customers. While we anticipate that this solution set will primarily appeal to a portion of the existing Oracle customer footprint, rather than taking share from competitors, it is embedding some strong mobile and social capabilities that should help it gain traction."

    Read the full article - "A Bull Market for Finance Cloud Apps"

    Ventana Research, a leading benchmark research and advisory services firm, made mention of Oracle Fusion Financials in a recent blog post. While we all know 'boring is cool', it was cool to see Robert Kugel, SVP Research, discussing Oracle's Fusion Financials strategy. Here are some excerpts:

    "For at least the next five years I believe Oracle has a good strategy, because the transition from the existing Oracle ERP offerings to Fusion Financials can be less painful than similar migrations..."
    "Deploying Fusion GL can facilitate a more consistent and faster way to execute finance department functions."
    "Fusion Financials is the go-forward accounting and financial applications suite that will coexist..."
    "Whether or not it's time to migrate, I think all users of Oracle's E-Business Suite, Oracle Applications, PeopleSoft and JD Edwards software should consider Fusion GL as part of an ongoing program to extract more value from their core financial systems."

    Read the full article - "Oracle Fusion Financials: Boring is Cool"

    Read the article

  • Designing web-based plugin systems correctly so they don't waste as many resources?

    - by Xeoncross
    Many CMS systems which rely on third parties for much of their code often build "plugin" or "hook" systems to make it easy for developers to modify the codebase's actions without editing the core files. This usually means an Observer or Event design pattern. However, when you look at systems like WordPress, you see that on every page they load some kind of bootstrap file from each of the plugins' folders to see if that plugin will need to run for that request. It's this poor design that causes systems like WordPress to spend many extra MBs of memory loading and parsing unneeded items on each page.

    Are there alternative ways to do this? I'm looking for ideas for building my own. For example, is there a way to load all this once and then cache the results so that your system knows how to lazy-load plugins? In other words, the system loads a configuration file that specifies all the events each plugin wishes to tie into, and then saves it for future requests. If that also performs poorly, then perhaps there is a special file structure that could be used to make educated guesses about when certain plugins are unneeded to fulfill the request. Any ideas? If anyone wants an example of the "plugin" concept you can find one here.
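
    To make the cached-registry idea concrete, here is a minimal sketch (written in Java for illustration; the names are made up, not from WordPress or any CMS): each plugin's manifest registers its hooks once, and plugin code is only loaded and parsed the first time one of its hooks actually fires.

        import java.util.ArrayList;
        import java.util.Collections;
        import java.util.HashMap;
        import java.util.List;
        import java.util.Map;
        import java.util.function.Supplier;

        public class HookRegistry {
            public interface Plugin {
                void handle(String event, Object payload);
            }

            // event name -> lazy loaders for the plugins hooked to it, built once
            // from each plugin's cached manifest (no plugin code runs here).
            private final Map<String, List<Supplier<Plugin>>> hooks = new HashMap<>();
            // loader -> instantiated plugin, so each plugin loads at most once
            private final Map<Supplier<Plugin>, Plugin> loaded = new HashMap<>();

            public void register(String event, Supplier<Plugin> loader) {
                hooks.computeIfAbsent(event, e -> new ArrayList<>()).add(loader);
            }

            public void fire(String event, Object payload) {
                for (Supplier<Plugin> loader : hooks.getOrDefault(event, Collections.emptyList())) {
                    // Instantiate (and parse) the plugin the first time it is needed.
                    Plugin p = loaded.computeIfAbsent(loader, Supplier::get);
                    p.handle(event, payload);
                }
            }
        }

    A request that only fires "page_render" never pays for plugins hooked solely to, say, "comment_posted"; their loaders stay dormant, which is exactly the waste the bootstrap-every-plugin approach incurs.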

    Read the article

  • XNA, how to draw two cubes standing side by side?

    - by user3535716
    I just got a problem with drawing two 3D cubes standing in a line. In my code, I made a Cube class, and in the Game1 class I built two cubes: A on the right side, B on the left side. I also set up an FPS camera in the 3D world. The problem is: if I draw cube B first (blue), and move the camera to the left side towards cube B, A (red) is still standing in front of B, which is apparently wrong. I guess some pics can make much sense. Then, I move the camera to the other side, and the situation is like: this is wrong... From this view, the red cube A should be behind the blue one, B... Could somebody give me help please?

    This is the Draw method in the Cube class:

        Matrix center = Matrix.CreateTranslation(new Vector3(-0.5f, -0.5f, -0.5f));
        Matrix scale = Matrix.CreateScale(0.5f);
        Matrix translate = Matrix.CreateTranslation(location);
        effect.World = center * scale * translate;
        effect.View = camera.View;
        effect.Projection = camera.Projection;

        foreach (EffectPass pass in effect.CurrentTechnique.Passes)
        {
            pass.Apply();
            device.SetVertexBuffer(cubeBuffer);
            RasterizerState rs = new RasterizerState();
            rs.CullMode = CullMode.None;
            rs.FillMode = FillMode.Solid;
            device.RasterizerState = rs;
            device.DrawPrimitives(PrimitiveType.TriangleList, 0, cubeBuffer.VertexCount / 3);
        }

    This is the Draw method in Game1:

        A.Draw(camera, effect);
        B.Draw(camera, effect);

    Read the article

  • Ubuntu unstable, showing erratic behavior

    - by Christophe De Troyer
    Let me start off by saying that this problem can't be described in a way that lets me find other topics with some relevance to it. That's why I created this question. In case this question has been asked before, I apologize.

    So, what is the problem? My computer (Intel Core i5 2500K with HD3000 graphics, 6 GB DDR3, 1 SSD, 3 HDDs and an Asus P8Z68 mobo) runs Windows on the SSD. But I decided to give Ubuntu a chance to be my daily OS for basic needs, since it's open source and I consider not being able to work with it a handicap. I ran the Windows installer and installed Ubuntu 12.04 to my 320 GB HDD, which was not being used in my computer. After installing and booting it, it worked great! I spent the rest of the day/night using it, and falling for it.

    Great. Today I booted Ubuntu (I had the choice in the bootloader, as expected). It asked me for my login and started logging in. Now, after letting it "boot" for a few (literally) minutes, I tried determining the cause of this. What I've figured out so far:

    - When I left-click on my desktop it freezes completely for a few seconds.
    - I have something like tearing (in games, you know) in the left-side menu when I move my mouse around.
    - It runs well when just hovering around with my mouse, but from the point I click on something, it freezes.

    What have I tried? I ran an HD Tune diagnostic on the HDD, but the performance seems to be very close to the stock values, so I'm taking it as a good HDD. I'm trying to get to the drivers update panel for Ubuntu, but with the state it's in, it's taking a lot of time.

    Could anyone point me in a direction for troubleshooting this? I'm not really a noob at all, just when using Linux. :) Thanks in advance! Christophe

    Read the article

  • For a JavaScript library, what is the best or standard way to support extensibility?

    - by Michael Best
    Specifically, I want to support "plugins" that modify the behavior of parts of the library. I couldn't find much information on the web about this subject, but here are my ideas for how a library could be extensible.

    1. The library exports an object with both public and "protected" functions. A plugin can replace any of those functions, thus modifying the library's behavior. Advantages of this method are that it's simple and that the plugin's functions have full access to the library's "protected" functions. Disadvantages are that the library may be harder to maintain with a larger set of exposed functions, and it could be hard to debug if multiple plugins are involved (how do you know which plugin modified which function?).

    2. The library provides an "add plugin" function that accepts an object with a specific interface. Internally, the library uses the plugin instead of its own code where appropriate. With this method, the internals of the library can be rearranged more freely as long as it still supports the same plugin interface. This could also support having different plugin interfaces to modify different parts of the library. A disadvantage of this method is that the plugins may have to re-implement code that is already part of the library, since the library's internal functions are not exported.

    3. The library provides a "set implementation" function that accepts an object inherited from a specific base object. The library's public API calls functions in the implementation object for any functionality that can be modified, and the base implementation object includes the core functionality, with both external (to the API) and internal functions. A plugin creates a new implementation object, which inherits from the base object and replaces any functions it wants to modify. This combines advantages and disadvantages of both the other methods.
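
    For concreteness, here is option 2's "add plugin" function sketched out. The question is about JavaScript, but the shape of the pattern is language-neutral, so this sketch uses Java (the typed interface makes the contract explicit); every name here is hypothetical.

        import java.util.Optional;

        public class Library {
            // The specific interface a plugin must implement (option 2).
            public interface FormatterPlugin {
                // Return a formatted value, or empty to fall back to the default.
                Optional<String> format(String value);
            }

            private FormatterPlugin plugin;  // one formatter plugin, for simplicity

            public void addPlugin(FormatterPlugin p) {
                this.plugin = p;
            }

            // Public API: consults the plugin first, then falls back to core
            // behavior, so the library's internals can change freely as long
            // as this interface is honored.
            public String render(String value) {
                if (plugin != null) {
                    Optional<String> custom = plugin.format(value);
                    if (custom.isPresent()) {
                        return custom.get();
                    }
                }
                return "[" + value + "]";  // the library's default formatting
            }
        }

    Since FormatterPlugin has a single method, registering a plugin is a one-liner: lib.addPlugin(v -> Optional.of(v.toUpperCase())).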

    Read the article

  • How do they keep track of the NPCs in Left 4 Dead?

    - by f20k
    How do they keep track of the NPC zombies in Left 4 Dead? I am talking about the NPCs that just walk into walls or wander around aimlessly. Even though the players cannot see them, they are there (say inside rooms or behind doors). Let's say there are about 10 or so zombies in a hallway and inside rooms. Does the game keep all of those zombies in a list and iterate through it, giving them commands? Do they just spawn when the user is within a certain radius or has reached a special location? Say you placed the 4 units (controlled by players) in completely different places throughout the map. Let's assume you aren't being swarmed and you have not killed any of these aimless NPCs. Would the game be keeping track of 10 x 4 = 40 zombies in total? Or is my understanding completely off? The reason I ask is that if I were to implement something similar on a mobile device, keeping track of 40 or more NPCs might not be such a great idea.
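
    I don't know Valve's internals, but a common pattern (sketched below in Java, purely as an illustration) matches the guess above: keep every live NPC in one list, iterate it each tick, and reserve full AI for NPCs near a player, while distant ones get a cheap tick or are despawned.

        import java.util.ArrayList;
        import java.util.List;

        public class NpcManager {
            static final double ACTIVE_RADIUS = 50.0;
            final List<Npc> npcs = new ArrayList<>();

            static class Npc {
                double x, y;
                void thinkFull()  { /* pathfinding, targeting, attacking */ }
                void thinkCheap() { /* wander timer only, no pathfinding */ }
            }

            // Called once per game tick with the current player positions.
            void update(List<double[]> playerPositions) {
                for (Npc npc : npcs) {
                    boolean nearAPlayer = false;
                    for (double[] p : playerPositions) {
                        double dx = npc.x - p[0], dy = npc.y - p[1];
                        if (dx * dx + dy * dy < ACTIVE_RADIUS * ACTIVE_RADIUS) {
                            nearAPlayer = true;
                            break;
                        }
                    }
                    // Full AI only near players; distant NPCs get a cheap tick.
                    if (nearAPlayer) npc.thinkFull(); else npc.thinkCheap();
                }
            }
        }

    So with 4 players spread out, the game might indeed hold roughly 40 zombies, but only the handful near each player costs real AI time; on a mobile device, despawning anything far outside every player's radius would keep the list itself small.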

    Read the article

  • How can I pass the referrer header from my HTTPS domain to HTTP domains?

    - by nutcracker
    My website is 100% HTTPS. I have links to other HTTP domains. The referrer header is not set when linking from an HTTPS page to an HTTP page. From http://en.wikipedia.org/wiki/HTTP_referrer:

    "If a website is accessed from a HTTP Secure (HTTPS) connection and a link points to anywhere except another secure location, then the referer field is not sent."

    I would prefer that other domains can see the referrer so that they know the traffic comes from my domain. Is there a way to force this header, or is there another solution?

    Update: I've done some basic testing using a redirect:

        http page  -- link to http  --> 301 redirect --> http page = referrer intact
        https page -- link to https --> 301 redirect --> http page = referrer blank
        https page -- link to http  --> 301 redirect --> http page = referrer blank
        https page -- link to http  --> 302 redirect --> http page = referrer blank

    The referrer is lost when linking from an HTTPS page to an HTTP redirect page on my own domain. So there is no referrer on the redirect.

    Read the article

  • Why does 301 redirect work for http but not for https?

    - by Tom G
    Through my domain registrar I have set up a domain, essayme.co.uk, to automatically forward to https://google.com. If I go to http://essayme.co.uk it works as expected and redirects me to https://google.com:

        $ curl -i http://essayme.co.uk
        HTTP/1.1 301 Moved Permanently
        Cache-Control: max-age=900
        Content-Type: text/html
        Location: https://google.com
        Server: Microsoft-IIS/7.5
        X-AspNet-Version: 4.0.30319
        X-Powered-By: ASP.NET
        Date: Sat, 07 Jun 2014 11:14:16 GMT
        Content-Length: 0
        Age: 0
        Connection: keep-alive

    However, if I go to https://essayme.co.uk it just freezes and times out:

        $ curl -i https://essayme.co.uk
        curl: (7) Failed connect to essayme.co.uk:443; Operation timed out

    What is happening in the second case? (And, if possible, how can I get the redirect to work for HTTPS?)

    Problem background/clarification: I don't have an SSL certificate for the essayme.co.uk domain above, but I do for my live domain (let's call it mywebsite.com), and I was seeing the exact same problem on this domain (hence why I'm trying to debug the problem). Unfortunately I can't experiment with the live domain (as it's live), and I would like to avoid having to buy a second certificate for essayme.co.uk just for debugging (unless absolutely necessary).

    The problem I was seeing: my live domain, mywebsite.com (not its real name), has a valid SSL certificate. Visiting https://www.mywebsite.com displayed the webpage as expected. I had set up forwarding (like in the question above) from the naked domain (mywebsite.com) to https://www.mywebsite.com.

    - Visiting http://mywebsite.com redirected to https://www.mywebsite.com as expected.
    - However, visiting https://mywebsite.com would freeze and time out (as in the question above).

    I also tried forwarding it to http://www.otherwebsite.com as an experiment (i.e. forwarding to another site that does not use SSL), but the result was the same:

    - Visiting http://mywebsite.com redirected to http://www.otherwebsite.com as expected.
    - Visiting https://mywebsite.com would freeze and time out again.

    So I set up essayme.co.uk as an experiment to try and understand why it doesn't work.

    Read the article
