Search Results

Search found 22413 results on 897 pages for 'train main'.


  • Choosing between a dedicated and virtual dedicated server for my startup

    - by MarathonStudios
    I'm about to launch a startup site I've been working on for some time, and I'm just now looking over hosting plans. The site's main feature is fairly processor-heavy (a lot of text processing), so I probably need something other than shared hosting to ensure I don't get shut down for overusing resources. I would like to spend as little as possible on hosting until the site starts generating income, so under $60/mo is my goal. One caveat is that I need a Windows box for this particular site, so it's harder to get a good deal. For that price, I can either get a bottom-tier dedicated server (2 GB RAM, Pentium 4) or, for a few bucks more per month, a middle-tier VPS (3 GB RAM, a bit more traffic and disk). I had a bad experience with a low-end VPS a few months ago, so making sure that whatever I get can handle the basic traffic of a website as well as giving me what I need (extra processing power) is essential. Do you have any suggestions as to which way to go, or a particular hosting company you've worked with that you can recommend?

    Read the article

  • Can I completely remove the Windows DNS in favour of BIND9 in an AD network?

    - by Vinícius Ferrão
    I would like to remove the DNS feature of the Windows Domain Controllers and point the DNS servers to our BIND9 servers. I know it's possible to set up coexistence, but this requires a number of extra Windows DNS servers equal to the number of Domain Controllers in the network. Active Directory expects the _msdcs zone and other things like _tcp, _udp, etc. The main question is: how do I make BIND9 take care of all this AD-specific data, with dynamic updates to keep AD even happier? Thanks. PS: Making BIND9 point to the Windows DNS servers to resolve the Active Directory-specific zones isn't an option. We already do this... EDIT: As of today, I'm running without Windows DNS. I'm writing up a guide on how to do this, and I'll update this topic.
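    The records AD cares about are ordinary SRV, A and CNAME records under _msdcs, _sites, _tcp and _udp, and the domain controllers (via Netlogon) will register them on their own as long as the zone accepts dynamic updates from them. A rough named.conf sketch is below; the zone name, file path and DC addresses are assumptions, and a production setup would normally use TSIG/GSS-TSIG signed updates rather than bare IP ACLs.

        // on the BIND9 master (illustrative values only)
        zone "corp.example.com" {
            type master;
            file "/var/lib/bind/db.corp.example.com";
            // let the domain controllers register their SRV/A records dynamically
            allow-update { 192.168.0.10; 192.168.0.11; };
        };

    With that in place, point the DCs' network settings at the BIND servers and restart the Netlogon service on each DC so the _msdcs data gets re-registered.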

    Read the article

  • OpenVZ multiple networks on CTs

    - by picca
    I have a Hardware Node (HN) which has 2 physical interfaces (eth0, eth1). I'm playing with OpenVZ and want to let my containers (CTs) have access to both of those interfaces. I'm using the basic configuration - venet. The CTs can access eth0 (the public interface) fine, but I can't get them to access eth1 (the private network). I tried:

        # on HN
        vzctl set 101 --ipadd 192.168.1.101 --save
        vzctl enter 101
        ping 192.168.1.2   # no response here
        ifconfig           # on the CT returns lo (127.0.0.1), venet0 (127.0.0.1), venet0:0 (95.168.xxx.xxx), venet0:1 (192.168.1.101)

    I believe the main problem is that all packets flow through eth0 on the HN (figured out using tcpdump), so the problem might be in the routes on the HN. Or is my logic here all wrong? I just need access to both interfaces (networks) on the HN from the CTs. Nothing complicated.
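    With venet the CT has no interface of its own on the 192.168.1.0/24 segment; everything is routed by the HN, so the HN has to forward the packets out of eth1 and answer ARP for the CT's address there. A rough sketch of what that usually takes on the HN (interface names and addresses are the ones from the question, everything else is an assumption):

        # on the HN
        sysctl -w net.ipv4.ip_forward=1
        sysctl -w net.ipv4.conf.eth1.proxy_arp=1   # so 192.168.1.x hosts can ARP for the CT's IP
        vzctl set 101 --ipadd 192.168.1.101 --save # venet address, routed via the HN as before

    An alternative is to give the CT a veth device (vzctl set 101 --netif_add eth1) and bridge it to the HN's eth1, which puts the CT directly on the private segment instead of routing it through venet.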

    Read the article

  • BUILD 2013 Session - Testing Your C#-Based Windows Store Apps

    - by Tim Murphy
    Originally posted on: http://geekswithblogs.net/tmurphy/archive/2013/06/27/build-2013-sessionndashtesting-your-c-base-windows-store-apps.aspx Testing an application is not what most people consider fun, and the number of situations that need to be tested seems to grow exponentially when building mobile apps. That is why I found the topic of this session interesting. When I found out that the speaker, Francis Cheung, was from the Patterns and Practices group, I knew I was in the right place. I have admired that team since I first met Ron Jacobs around 2001. So what did Francis have to offer? He started off in a rather confusing "who's on first" fashion. It seems that one of his testers was originally supposed to give the talk, but then it was decided that it would be better to have someone who does development present a testing topic. This didn't hinder the content of the talk in the least. He broke the process down in a logical manner that would be straightforward to understand, if not to implement. Francis hit the main areas we usually think of, such as tombstoning, network connectivity and asynchronous code, but he approached them with tools we may not have thought of until now. He relied heavily on Fiddler to intercept and change the behavior of network requests. Then there are the areas you might not normally think to check. These include localization, accessibility and updating client code to a new version. These are important aspects of your app that can severely impact how customers feel about it. Take the time to view this session and get a new appreciation for testing and where it fits in your development lifecycle. del.icio.us Tags: BUILD 2013, Testing, C#, Windows Store Apps, Fiddler

    Read the article

  • Announcing Monthly Silverlight Giveaways on MichaelCrump.Net

    - by mbcrump
    I've been working with different companies to give away a set of Silverlight controls to my blog readers for the past few months. I've decided to set up this page and a friendly URL for everyone to keep track of my monthly giveaway. The URL to bookmark is: http://giveaways.michaelcrump.net. My main goals for giving away the controls are:

      - I love the Silverlight community, and giving away nice controls always sparks interest in Silverlight.
      - To spread the word about a company and let the user decide if it meets their particular needs.
      - I am a "control" junkie; knowing what is out there helps me solve my own business problems.
      - To provide some additional hits for my blog.

    Below is a grid that I will update monthly with the current giveaway. Feel free to bookmark this post and visit it monthly.

      - October 2010 | Telerik | Silverlight Controls (RadControls) | http://michaelcrump.net/archive/2010/10/15/win-telerik-radcontrols-for-silverlight-799-value.aspx
      - November 2010 | Mindscape | Mindscape Mega-Pack | http://michaelcrump.net/archive/2010/11/11/mindscape-silverlight-controls--free-mega-pack-contest.aspx
      - December 2010 | Infragistics | Silverlight Controls + Silverlight Data Visualization | http://michaelcrump.net/mbcrump/archive/2010/12/15/win-a-set-of-infragistics-silverlight-controls-with-data-visualization.aspx
      - January 2011 | Cellbi | Cellbi Silverlight Controls | http://michaelcrump.net/mbcrump/archive/2011/01/05/cellbi-silverlight-controls-giveaway-5-license-to-give-away.aspx
      - February 2011 | *BOOKED*

    To third-party Silverlight companies: if you create any type of Silverlight control/application and would like to feature it on my blog, you may contact me at michael[at]michaelcrump[dot]net. Giving away controls has proven to be beneficial for both parties, as I have around 4k Twitter followers and average around 1000 page views a day. Contact me today and give back to the Silverlight community. Subscribe to my feed

    Read the article

  • Creating a cookieless application on a development machine with ASP.NET

    - by zaladane
    I am thinking about setting up a new domain to host static content on my website and have it cookieless, just like Stack Overflow does with their static domain. Before going ahead and buying the domain and setting it up, I wanted to test it on my development machine first under localhost (I should mention that I am planning on having IIS serve the static files on the new domain). I therefore created a new application under IIS and disabled session state and forms authentication. When my main application needs resources like CSS, images and JS, I use the path to the "static" application where they are hosted. The problem is that when I look at the request and the response for those files, they still have the session_id cookie defined as well as the asp.net authentication cookie. Is it at all possible to accomplish what I am trying to do on a development machine, or do I have to just go ahead and purchase the new domain, which hopefully will make things right? I tried to read about cookieless domains but can't figure out what I might be missing.
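    Part of what you are seeing is expected: the ASP.NET session and forms-auth cookies are scoped to the host name (here, localhost), so the browser attaches them to every request to that host regardless of which IIS application answers; to see truly cookieless requests in development you need a second host name (even a hosts-file alias) for the static application. Disabling the cookie sources in the static app itself still helps, and a minimal sketch of its own web.config is below, assuming the static app needs neither session state nor authentication.

        <!-- static application's web.config (sketch) -->
        <configuration>
          <system.web>
            <sessionState mode="Off" />
            <authentication mode="None" />
            <roleManager enabled="false" />
          </system.web>
        </configuration>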

    Read the article

  • Mac and PC problems on a home network

    - by tombull89
    Hello! At home we have a wireless router that my family want to use. We have our main computer physically connected to the router, and my laptop is connected wirelessly. When the network is like this it is faultless. However, when my brother introduces his Apple Mac into the equation, both my laptop and the family machine get all sorts of problems, primarily long load times and timeouts. The Mac, however, works fine. I think I've read something here or on SF about a Mac continuously pinging a router, which times it out, but I've not found any solution so far. Router: Belkin F5D7634uk4A-H. Home computer: XP SP3. My laptop: Windows 7 Ultimate. Mac: 13" MacBook Pro, Snow Leopard.

    Read the article

  • Technical development decision for my newly established software company

    - by test test
    I have a new software company where I am planning to develop a CRM system. I have settled on the technological approach I am going to use:

      - I will use an open source Java-based CRM engine.
      - I will use a third-party reporting tool named JasperReports to provide reporting capabilities for the CRM.
      - I will develop the interface, and any customization the customer might ask for, using the ASP.NET MVC framework, since my knowledge and experience are based on ASP.NET.
      - I will use the CRM's API to integrate my ASP.NET web application with the Java-based CRM.

    I have developed a simple demo which integrates these three main components (CRM engine, ASP.NET application and the reporting tool), and they worked well. But I am afraid of the following risk if I go with the above approach: I would have to hire developers with different skills and experience:

      - Developers with Java skills, able to modify the Java-based CRM and write plug-ins (when needed) to extend the CRM's capabilities.
      - Other developers with ASP.NET skills, able to build the application itself: application forms, the portal from which users will start the CRM processes, searching capabilities, etc.

    So does the above point raise real risks when I start hiring a new team and building the CRM application, or am I on the right track at this early stage?

    Read the article

  • Temporarily disable an AD server

    - by 3molo
    Topology and setup: We have main office A and a branch office (abroad) B. Our ISP somehow messed up the MPLS, and offices A and B will not be connected for a few days. At location B we have one AD domain controller (the other two are at location A). Location A also has an Exchange server.

    The problems: A few users at A have problems logging in to their computers running Windows XP; the logon process kind of hangs at "Applying computer policies". Additionally, I can't start the Exchange Management Shell; it fails on get-recipient because the AD at location B is unreachable.

    Solution? I could delete the AD at B, but I'm pretty sure it would be a hassle to re-join it, and since the office is abroad it's not an option to just go there and re-install/re-join it. So I now wonder how, from location A's primary and secondary ADs, I can temporarily disable the AD at location B.

    Read the article

  • Partner Infoline & Service Portal

    - by uwes
    As an EMEA-wide team we support the daily work of our partners. Our team consists of 24 sales consultants, one third of whom specialize in the Partner Infoline. Partner Infoline's main focus is to actively and reactively deliver technical pre-sales knowledge about the Oracle hardware portfolio to our partners. With Infoline we assist our partners in their daily work; furthermore, we help to educate our partners to be self-sufficient in all aspects and questions about hardware configurations and hardware quotes. For our Infoline service we use a ticketing system called Service Portal, which is widely used within Oracle and delivers good, stable functionality and availability. Our Infoline service provides answers to questions concerning technical pre-sales matters that are related to hardware and the corresponding hardware-related software.* You can address these types of questions by sending them to our mailing list: [email protected]. The Service Portal will send you an auto-reply including a unique reference number, which will be the identification for your request until it is closed. Depending on the complexity of the request, it might be necessary to forward it to our specialists (servers, storage, tape, Solaris etc.) located all over Europe. In order to make the whole process smooth, here are some recommendations:

      - Write your request in English; this saves translation time when it has to be forwarded to the specialists.
      - State your interest area clearly in the title, for example "memory in M4000 server".
      - One request/one subject; this makes it easier to maintain and keeps the correspondence clear and simple.

    The rule of the service is to provide an answer quickly, which means the vast majority of requests are answered within a couple of hours. However, please keep in mind that some requests may need extra work by involving the appropriate person within Europe or even in the US. Therefore there is no official SLA for this service. * This excludes Oracle "classic" products and post-sales support. The latter should still be addressed through MOS (http://support.oracle.com)

    Read the article

  • Microsoft's new technical computing initiative

    - by Randy Walker
    I made a mental note from earlier in the year.  Microsoft literally buys computers by the truckload.  From what I understand, it’s a typical practice amongst large software vendors.  You plug a few wires in, you test it, and you instantly have mega tera tera flops (don’t hold me to that number).  Microsoft has been trying to plug away at their cloud services (named Azure).  Which, for the layman, means Microsoft runs your software on their computers, and as demand increases you can allocate more computing power on the fly. With this in mind, it doesn’t surprise me that I was recently sent an executive email concerning Microsoft’s new technical computing initiative.  I find it to be a great marketing idea with actual substance behind their real work.  From the programmer academic perspective, in college we dreamed about this type of processing power.  This has decades of computer science theory behind it. A copy of the email received.  (note that I almost deleted this email, thinking it was spam due to it’s length) We don't often think about how complex life really is. Take the relatively simple task of commuting to and from work: it is, in fact, a complicated interplay of variables such as weather, train delays, accidents, traffic patterns, road construction, etc. You can however, take steps to shorten your commute - using a good, predictive understanding of a few of these variables. In fact, you probably are already taking these inputs and instinctively building a predictive model that you act on daily to get to your destination more quickly. Now, when we apply the same method to very complex tasks, this modeling approach becomes much more challenging. Recent world events clearly demonstrated our inability to process vast amounts of information and variables that would have helped to more accurately predict the behavior of global financial markets or the occurrence and impact of a volcano eruption in Iceland. To make sense of issues like these, researchers, engineers and analysts create computer models of the almost infinite number of possible interactions in complex systems. But, they need increasingly more sophisticated computer models to better understand how the world behaves and to make fact-based predictions about the future. And, to do this, it requires a tremendous amount of computing power to process and examine the massive data deluge from cameras, digital sensors and precision instruments of all kinds. This is the key to creating more accurate and realistic models that expose the hidden meaning of data, which gives us the kind of insight we need to solve a myriad of challenges. We have made great strides in our ability to build these kinds of computer models, and yet they are still too difficult, expensive and time consuming to manage. Today, even the most complicated data-rich simulations cannot fully capture all of the intricacies and dependencies of the systems they are trying to model. That is why, across the scientific and engineering world, it is so hard to say with any certainty when or where the next volcano will erupt and what flight patterns it might affect, or to more accurately predict something like a global flu pandemic. So far, we just cannot collect, correlate and compute enough data to create an accurate forecast of the real world. But this is about to change. Innovations in technology are transforming our ability to measure, monitor and model how the world behaves. 
The implication for scientific research is profound, and it will transform the way we tackle global challenges like health care and climate change. It will also have a huge impact on engineering and business, delivering breakthroughs that could lead to the creation of new products, new businesses and even new industries. Because you are a subscriber to executive e-mails from Microsoft, I want you to be the first to know about a new effort focused specifically on empowering millions of the world's smartest problem solvers. Today, I am happy to introduce Microsoft's Technical Computing initiative. Our goal is to unleash the power of pervasive, accurate, real-time modeling to help people and organizations achieve their objectives and realize their potential. We are bringing together some of the brightest minds in the technical computing community across industry, academia and science at www.modelingtheworld.com to discuss trends, challenges and shared opportunities. New advances provide the foundation for tools and applications that will make technical computing more affordable and accessible where mathematical and computational principles are applied to solve practical problems. One day soon, complicated tasks like building a sophisticated computer model that would typically take a team of advanced software programmers months to build and days to run, will be accomplished in a single afternoon by a scientist, engineer or analyst working at the PC on their desktop. And as technology continues to advance, these models will become more complete and accurate in the way they represent the world. This will speed our ability to test new ideas, improve processes and advance our understanding of systems. Our technical computing initiative reflects the best of Microsoft's heritage. Ever since Bill Gates articulated the then far-fetched vision of "a computer on every desktop" in the early 1980's, Microsoft has been at the forefront of expanding the power and reach of computing to benefit the world. As someone who worked closely with Bill for many years at Microsoft, I am happy to share with you that the passion behind that vision is fully alive at Microsoft and is carried out in the creation of our new Technical Computing group. Enabling more people to make better predictions We have seen the impact of making greater computing power more available firsthand through our investments in high performance computing (HPC) over the past five years. Scientists, engineers and analysts in organizations of all sizes and sectors are finding that using distributed computational power creates societal impact, fuels scientific breakthroughs and delivers competitive advantages. For example, we have seen remarkable results from some of our current customers: Malaria strikes 300,000 to 500,000 people around the world each year. To help in the effort to eradicate malaria worldwide, scientists at Intellectual Ventures use software that simulates how the disease spreads and would respond to prevention and control methods, such as vaccines and the use of bed nets. Technical computing allows researchers to model more detailed parameters for more accurate results and receive those results in less than an hour, rather than waiting a full day. Aerospace engineering firm, a.i. solutions, Inc., needed a more powerful computing platform to keep up with the increasingly complex computational needs of its customers: NASA, the Department of Defense and other government agencies planning space flights. 
To meet that need, it adopted technical computing. Now, a.i. solutions can produce detailed predictions and analysis of the flight dynamics of a given spacecraft, from optimal launch times and orbit determination to attitude control and navigation, up to eight times faster. This enables them to avoid mistakes in any areas that can cause a space mission to fail and potentially result in the loss of life and millions of dollars. Western & Southern Financial Group faced the challenge of running ever larger and more complex actuarial models as its number of policyholders and products grew and regulatory requirements changed. The company chose an actuarial solution that runs on technical computing technology. The solution is easy for the company's IT staff to manage and adjust to meet business needs. The new solution helps the company reduce modeling time by up to 99 percent - letting the team fine-tune its models for more accurate product pricing and financial projections. Our Technical Computing direction Collaborating closely with partners across industry and academia, we must now extend the reach of technical computing even further to help predictive modelers and data explorers make faster, more accurate predictions. As we build the Technical Computing initiative, we will invest in three core areas: Technical computing to the cloud: Microsoft will play a leading role in bringing technical computing power to scientists, engineers and analysts through the cloud. Existing high- performance computing users will benefit from the ability to augment their on-premises systems with cloud resources that enable 'just-in-time' processing. This platform will help ensure processing resources are available whenever they are needed-reliably, consistently and quickly. Simplify parallel development: Today, computers are shipping with more processing power than ever, including multiple cores, but most modern software only uses a small amount of the available processing power. Parallel programs are extremely difficult to write, test and trouble shoot. However, a consistent model for parallel programming can help more developers unlock the tremendous power in today's modern computers and enable a new generation of technical computing. We are delivering new tools to automate and simplify writing software through parallel processing from the desktop... to the cluster... to the cloud. Develop powerful new technical computing tools and applications: We know scientists, engineers and analysts are pushing common tools (i.e., spreadsheets and databases) to the limits with complex, data-intensive models. They need easy access to more computing power and simplified tools to increase the speed of their work. We are building a platform to do this. Our development efforts will yield new, easy-to-use tools and applications that automate data acquisition, modeling, simulation, visualization, workflow and collaboration. This will allow them to spend more time on their work and less time wrestling with complicated technology. Thinking bigger There is so much left to be discovered and so many questions yet to be answered in the fascinating world around us. We believe the technical computing community will show us that we have not seen anything yet. Imagine just some of the breakthroughs this community could make possible: Better predictions to help improve the understanding of pandemics, contagion and global health trends. 
Climate change models that predict environmental, economic and human impact, accessible in real-time during key discussions and debates. More accurate prediction of natural disasters and their impact to develop more effective emergency response plans. With an ambitious charter in hand, this new team is ready to build on our progress to-date and execute Microsoft's technical computing vision over the months and years ahead. We will steadily invest in the right technologies, tools and talent, and work to bring together the technical computing community. I invite you to visit www.modelingtheworld.com today. We welcome your ideas and feedback. I look forward to making this journey with you and others who want to answer the world's biggest questions, discover solutions to problems that seem impossible and uncover a host of new opportunities to change the world we live in for the better. Bob

    Read the article

  • links for 2010-05-06

    - by Bob Rhubart
    - Podcast: Collaborate 10 Wrap-Up - Conclusion #c10. More Collaborate 2010 Las Vegas highlights and hijinks from this ten-member panel, including OAUG and ODTUG board members, members of the Oracle ACE program, and OAUG President Dave Ferguson. (tags: otn oracle collaborate2010)
    - Peter Scott: Realtime Data Warehouse Loading. Rittman-Mead's Peter Scott looks at putting data into a data warehouse in real time. (tags: oracle datawarehousing businessintelligence)
    - Live Webcast: Social BPM - Integrating Enterprise 2.0 with Business Applications - May 12, 2010 at 11:00 a.m. PT. Business Process Management with integrated Enterprise 2.0 collaboration can improve business responsiveness and enhance overall enterprise productivity. Learn how to take your business to the next level with a unified solution that fosters process-based collaboration between employees, partners, and customers. (tags: oracle otn bpm enterprise2.0 webcast)
    - Management Pack for Identity Management Viewlet. A screencast produced by the Grid Control team showing the features of the Identity Management Pack for Grid Control 11g. Grid Control 11g now works with Oracle Virtual Directory 11g. (tags: oracle otn security identitymanagement)
    - @pevansgreenwood: Having too much SOA is a bad thing (and what we might do about it). "The problem is usually too much flexibility, as flexibility creates complexity, and complexity exponentially increases the effort required to manage and deliver the software." -- Peter Evans-Greenwood (tags: soa complexity flexibility)
    - @vampbenepe: Integration patterns for social data: the Open Social Data Bus. "The main point is about defining the right integration pattern for social data: is it a 'message bus' pattern or a 'shared database' pattern?" -- William Vampbenepe (tags: oracle otn enterprise2.0 enterprisearchitecture)

    Read the article

  • MaxClients in Apache: how do I know the size of my processes?

    - by Larry
    From http://httpd.apache.org/docs/2.2/misc/perf-tuning.html: The single biggest hardware issue affecting webserver performance is RAM. A webserver should never ever have to swap, as swapping increases the latency of each request beyond a point that users consider "fast enough". This causes users to hit stop and reload, further increasing the load. You can, and should, control the MaxClients setting so that your server does not spawn so many children it starts swapping. The procedure for doing this is simple: determine the size of your average Apache process, by looking at your process list via a tool such as top, and divide this into your total available memory, leaving some room for other processes. The main issue is that I can't understand how to find that size; top shows my httpd processes at no more than 3888. But if I need to determine the number for MaxClients and I have 4 GB of RAM, I get 972, so should I use something like 900 for MaxClients?
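    A rough way to measure it, assuming a Linux box and Apache children named "httpd" (use apache2 on Debian-style systems):

        ps -C httpd -o rss= | awk '{sum += $1; n++} END {printf "children: %d  avg RSS: %.0f KB\n", n, sum/n}'
        # MaxClients ~= (RAM you can give Apache, in KB) / (average RSS, in KB)
        # e.g. reserving ~3.5 GB of the 4 GB and ~30 MB per child gives a MaxClients of roughly 120

    Note that RSS overstates the real per-child cost because much of it is shared with the parent, so the figure from top or ps is a conservative starting point. If the 3888 you are seeing is in KB, that is under 4 MB per child, and your estimate of roughly 900-1000 is in the right ballpark.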

    Read the article

  • Mono on Linux: Apache or Nginx

    - by Furism
    Hi, I'm developing an ASP.NET application that will be run under Linux/Mono for various reasons (mostly to stay away from IIS, quite frankly). Of course the first web server I had in mind was Apache. But Apache, for all its advantages, adds a lot of overhead. Also, the application I'm building needs to be highly scalable, and performance is one of the main concerns. Apache has, obviously, a very good reputation and its track record speaks for itself, but I don't need things like reverse proxying or load balancing because dedicated network devices will be used for that, so those Apache modules will never be used. So basically my question is: since Nginx seems to fit my needs exactly, are there any caveats I should be aware of? For instance, is Nginx known to be particularly secure? When security flaws are detected, how fast are they patched? Any insight on the pros and cons of using either of these servers in conjunction with Mono is welcome.
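    Nginx itself has no module for running .NET code, so the usual Mono setup is nginx in front of fastcgi-mono-server over FastCGI; the main caveat is that you manage (and monitor/restart) the Mono process yourself, since nginx won't spawn it for you the way Apache's mod_mono does. A minimal sketch, with the host name, paths and socket location as assumptions:

        # nginx server block (sketch)
        server {
            listen       80;
            server_name  example.com;
            root         /srv/www/myapp;
            index        index.aspx;

            location / {
                fastcgi_pass   unix:/tmp/fastcgi-mono.sock;
                include        fastcgi_params;
                fastcgi_param  SCRIPT_FILENAME  $document_root$fastcgi_script_name;
                fastcgi_param  PATH_INFO        "";
            }
        }

        # the Mono side is started separately, e.g. under a supervisor:
        # fastcgi-mono-server4 /applications=example.com:/:/srv/www/myapp /socket=unix:/tmp/fastcgi-mono.sock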

    Read the article

  • Partner Spotlight

    - by rituchhibber
    FADATA: Fadata (www.fadata.bg) officially became a WebCenter Content Specialized partner in the Adriatic region upon the successful completion of the corresponding Oracle specialization tests. ''Being recognized by Oracle and customers will greatly help our company in maintaining a high level of implementation services related to Oracle WebCenter Content, as one of the strategic products we are focused on. This certification, which our team is very proud of, will certainly help our company FADATA gain additional advantage, competitiveness, and integrity in implementing Oracle WebCenter Content solutions, both on current and future projects in the region,'' according to Velimir Corovic and Marjan Nikolic from Fadata. FISHBOWL SOLUTIONS: Google Search Appliance Connector for Oracle WebCenter Content. The Google Search Appliance (GSA) provides fast search for your intranet or website. Fishbowl Solutions provides a connector for the Google Search Appliance that integrates it with the Oracle WebCenter Content Server while retaining the security benefits of Oracle WebCenter Content. For more information and a real customer example, see the Fairview Health Services case study or the webinar recording. SIGNUM: Signum TTE, from Turkey and a Gold member of Oracle® PartnerNetwork (OPN), recently announced it has achieved OPN Specialized status for Oracle WebCenter Portal. Signum TTE, which began operations in the IT sector in 2005, is an innovative software solution house focusing on two main areas: it provides services and solutions around Oracle Middleware products and technologies and its own product, "WinDesk Service Management".

    Read the article

  • IE8 Windows 7 (64bit) security certificate problem

    - by Steve
    Hi, we have just received some new computers for use in the office (Dell Vostro). They seem to work fine in the main. When we use IE8 to go to some web pages, such as Yahoo Mail, it tells us: "There is a problem with this website's security certificate". If we have a look at the details it says: "This certificate cannot be verified up to a trusted certification authority". This, however, works correctly in Firefox. I don't understand why I should get such an error message; should this not just work? I don't think it's anything to do with the certificate itself, as this is happening on www.yahoo.co.uk and other commercial (Amazon, I think?) sites. I think there is something off with the PCs' setup. The PCs have Windows 7 (64-bit) and Norton Internet Security installed. Any ideas as to why this is happening? Thanks

    Read the article

  • The Future of Life Assurance Conference Recap

    - by [email protected]
    I recently wrote about the Life Insurance Conference held in Washington, DC last month. This week I was both an attendee and a guest speaker at the 13th Annual Future of Life Assurance Conference held at The Guoman Tower in London, UK. It's amazing that these two conferences were held on opposite sides of the Atlantic Ocean and addressed many of the same session topics and themes. Insurance is certainly a global industry! This year's conference was attended by many of the leading carriers and CEOs in the UK and across Europe. The sessions included a strong lineup of keynote speakers and panel discussions from carriers such as Legal & General, Skandia, Aviva, Standard Life, Friends Provident, LV=, Zurich UK, Barclays and Scottish Life. Session topics addressed a variety of business and regulatory issues, including:

      - Ensuring a profitable future
      - Key priorities in regulation
      - The future of advice
      - The impact of the RDR on distribution
      - Bancassurance
      - Gaining control of the customer relationship
      - Revitalizing product offerings

    In addition, Oracle speakers (Glenn Lottering and myself) led specific sessions on gearing up for Solvency II and speeding product development through adaptive rules-based systems. The main themes that played throughout many of the sessions were: change is here, focus on customers, the current economic crisis has been challenging, and the industry needs to get back to basics and simplify, simplify, simplify. Additionally, it is clear that the UK life and pension markets will be going through some major changes as new RDR regulation related to advisor fees and commissions, and automatic enrollment, are rolled out in 2012. Roger A. Soppe, CLU, LUTCF, is the Senior Director of Insurance Strategy, Oracle Insurance.

    Read the article

  • Prolific USB-to-Serial Comm Port significantly slower under Windows 7 compared to Windows XP

    - by Dmitry S
    I am using a USB-to-serial adapter based on the Prolific chip to talk to a device on a serial port. I have the latest version of the driver installed: 1.3.0 (2010-7-15). When I use my device with this adapter on my main Windows 7 (32-bit) system, it takes 8-9 seconds to send a command through to the device. However, when I do the same thing on a different Windows XP system (an old laptop I borrowed for testing), it only takes 2-3 seconds. I have made sure that the port settings and other variables are the same between systems. I also tested on a third laptop (also running Windows 7) and again got a significant delay. So the question is: has anyone else experienced the same problem and found a solution? I would like to avoid moving to an XP system for what I need to achieve, so that's my last option.

    Read the article

  • First-time application: where to start?

    - by Nazariy
    After many years of searching and copy-pasting, I'm still looking for a simple solution that can transliterate text input on the fly from one key set to another. There are quite a few online services that provide this feature, but it is still quite annoying to have to go online all the time. Unfortunately there are not that many applications left that can do this, and none of them is supported to this day. I decided to make my own and at the same time learn something new. The idea is quite simple: the application should sit in the system tray and wait until the input language gets changed, for example to Russian. If Russian is activated, the application should start listening for the user's key stroke combinations and replace them based on a custom dictionary, for example R = Р, SH = Ш, etc. I should be able to bind the application to any installed language (Russian, Ukrainian, Bulgarian, Belarusian etc.) and customise the dictionary for any of them. So my questions are (see the sketch below):

      - Which language should I choose for this task: C++, C#, or maybe something hardcore like assembler? The application should work natively with Windows XP/Vista/7 and possibly Mac (cross-platform support is good, but my main target is Windows).
      - Due to the nature of the application's behaviour, how can I tell anti-virus software that it is not a "key logger" and basically not a virus?
      - Where should I start, and what should I be aware of?

    P.S. My current programming knowledge is quite basic: PHP and JavaScript with an object-oriented approach.
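    For the Windows-only core, C# is a workable choice: a low-level keyboard hook (WH_KEYBOARD_LL) via P/Invoke lets you see and swallow keystrokes system-wide without writing a driver, and checking the active keyboard layout tells you when your bound language is in use. A bare-bones sketch of the hook plumbing is below; the class name is made up, and the actual transliteration (dictionary lookup plus injecting the replacement character, e.g. with SendInput) is left as a TODO.

        using System;
        using System.Diagnostics;
        using System.Runtime.InteropServices;
        using System.Windows.Forms;

        static class TransliterationHook
        {
            private const int WH_KEYBOARD_LL = 13;
            private const int WM_KEYDOWN = 0x0100;

            private delegate IntPtr LowLevelKeyboardProc(int nCode, IntPtr wParam, IntPtr lParam);
            private static readonly LowLevelKeyboardProc _proc = HookCallback; // keep a reference so the delegate isn't collected
            private static IntPtr _hookId = IntPtr.Zero;

            [STAThread]
            public static void Main()
            {
                using (Process cur = Process.GetCurrentProcess())
                using (ProcessModule mod = cur.MainModule)
                    _hookId = SetWindowsHookEx(WH_KEYBOARD_LL, _proc, GetModuleHandle(mod.ModuleName), 0);

                Application.Run();              // message loop keeps the hook alive (the tray icon would live here)
                UnhookWindowsHookEx(_hookId);
            }

            private static IntPtr HookCallback(int nCode, IntPtr wParam, IntPtr lParam)
            {
                if (nCode >= 0 && wParam == (IntPtr)WM_KEYDOWN)
                {
                    int vkCode = Marshal.ReadInt32(lParam);
                    // TODO: if the active layout is the bound language, look the key up in the
                    // custom dictionary, swallow it (return (IntPtr)1) and inject the replacement.
                    Debug.WriteLine((Keys)vkCode);
                }
                return CallNextHookEx(_hookId, nCode, wParam, lParam);
            }

            [DllImport("user32.dll", SetLastError = true)]
            private static extern IntPtr SetWindowsHookEx(int idHook, LowLevelKeyboardProc lpfn, IntPtr hMod, uint dwThreadId);

            [DllImport("user32.dll", SetLastError = true)]
            private static extern bool UnhookWindowsHookEx(IntPtr hhk);

            [DllImport("user32.dll", SetLastError = true)]
            private static extern IntPtr CallNextHookEx(IntPtr hhk, int nCode, IntPtr wParam, IntPtr lParam);

            [DllImport("kernel32.dll", SetLastError = true)]
            private static extern IntPtr GetModuleHandle(string lpModuleName);
        }

    On the anti-virus question there is no flag you can set in code; sticking to the documented hook API (rather than injecting into other processes), signing the executable and, if needed, submitting it to the AV vendors' false-positive programs is about all you can do.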

    Read the article

  • How can I get 32-bit Direct3D working on my 64-bit Windows 7 system?

    - by Daniel Stutzbach
    I recently upgraded a Dell Inspiron 6400 to Windows 7 Ultimate 64-bit. I have a 32-bit 3D application that refuses to run, giving an error of "Failed to initialise [sic] Direct3D device". The dxdiag tool tells me:

        DirectDraw Acceleration: Enabled
        Direct3D Acceleration: Not Available

    However, the 64-bit version of dxdiag tells me:

        DirectDraw Acceleration: Enabled
        Direct3D: Enabled

    I have installed and re-installed the latest graphics drivers, as well as the DirectX 9 redistributable, but it still fails in the same way. dxdiag reports the chipset name as the "Mobile Intel(R) 945 Express Chipset Family" with the chip type as "Intel(R) GMA 950". The main driver is igdumd64.dll, version 8.15.10.1930. How can I get 32-bit Direct3D working?

    Read the article

  • Why can't I load a simple pixel shader effect (.fx) file in XNA?

    - by Mehdi Bugnard
    I just want to load a simple *.fx file into my project to create a (pixel shader) effect, but whenever I try to compile the project I get the following error in the Visual Studio Error List:

        Errors compiling ..
        ID3DXEffectCompiler: There were no techniques
        ID3DXEffectCompiler: Compilation failed

    I already searched on Google and found many people with the same problem, and I figured it was a problem of encoding, with the line endings ('\n') not being recognized. I tried to copy and paste into Notepad and save it with ASCII or UTF-8 encoding, but the result is always the same. Do you have an idea please? Thanks a lot :-) Here is my [.fx] file:

        sampler BaseTexture : register(s0);
        sampler MaskTexture : register(s1)
        {
            addressU = Clamp;
            addressV = Clamp;
        };

        // All of these variables are pixel values
        // Feel free to replace with float2 variables
        float MaskLocationX;
        float MaskLocationY;
        float MaskWidth;
        float MaskHeight;
        float BaseTextureLocationX;  // This is where your texture is to be drawn
        float BaseTextureLocationY;  // texCoord is different, it is the current pixel
        float BaseTextureWidth;
        float BaseTextureHeight;

        float4 main(float2 texCoord : TEXCOORD0) : COLOR0
        {
            // We need to calculate where in terms of percentage to sample from the MaskTexture
            float maskPixelX = texCoord.x * BaseTextureWidth + BaseTextureLocationX;
            float maskPixelY = texCoord.y * BaseTextureHeight + BaseTextureLocationY;
            float2 maskCoord = float2((maskPixelX - MaskLocationX) / MaskWidth, (maskPixelY - MaskLocationY) / MaskHeight);
            float4 bitMask = tex2D(MaskTexture, maskCoord);
            float4 tex = tex2D(BaseTexture, texCoord);
            // It is a good idea to avoid conditional statements in a pixel shader if you can use math instead.
            return tex * (bitMask.a);
            // Alternate calculation to invert the mask, you could make this a parameter too if you wanted
            // return tex * (1.0 - bitMask.a);
        }
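    The compiler message itself points at the likely culprit: the effect file above ends after main() and never declares a technique, so ID3DXEffectCompiler really does find no techniques, regardless of the file's encoding. A minimal sketch of the missing block follows; the technique and pass names are arbitrary, and ps_2_0 is an assumption that matches XNA's Reach profile.

        // append to the end of the .fx file (sketch)
        technique MaskTechnique
        {
            pass Pass1
            {
                PixelShader = compile ps_2_0 main();
            }
        }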

    Read the article

  • Web server (IIS) and database mirroring (Postgresql)

    - by Timka
    Recently our web server crashed and we had to recover everything from a backup, which took the whole day (totally unacceptable in our business). So my question is: how can I create a complete mirror of the server that I can use (switch DNS to) in case the same disaster happens in the future? Our main server is on Amazon with Windows 2008/IIS + PostgreSQL 9.1. I was thinking of creating the same server in a different location as a complete mirror, with database replication, but I'm not sure how to implement IIS instance mirroring over the internet...
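    For the database half, PostgreSQL 9.1 already ships with streaming replication plus hot standby, which gives you a warm mirror you can promote when the primary dies. A rough sketch of the settings involved (host addresses, the replication role and the numbers are assumptions):

        # primary: postgresql.conf
        wal_level = hot_standby
        max_wal_senders = 3
        wal_keep_segments = 64

        # primary: pg_hba.conf
        host  replication  replicator  10.0.1.20/32  md5

        # standby: postgresql.conf
        hot_standby = on

        # standby: recovery.conf (after seeding the data directory from a base backup)
        standby_mode = 'on'
        primary_conninfo = 'host=10.0.1.10 port=5432 user=replicator password=secret'

    The IIS side has no equivalent built-in mirroring; the usual approach is to keep the site's configuration and content in sync with a deployment tool (for example Web Deploy / msdeploy, or scripted exports) and switch DNS, or to put both nodes behind a load balancer with the standby normally idle.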

    Read the article

  • Apache Options -Indexes gives me 404 instead of 403, why?

    - by netmano
    I have an Apache 2.2.21 (Debian) web server on which I disabled directory listing with Options -Indexes, but now I get a 404 error for a directory where I think I should get a 403. I have no idea why I get 404 rather than 403. What should I check? I have disabled the autoindex module; after that I got a 404 for every URL that requests a directory listing (e.g. www.somesite.com/dir). How can I get a 403 for this? (The dir does exist.) As a test I also put Options -Index at the end of the main config file (apache2.conf).
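    For what it's worth, the 403 you expect is produced by mod_autoindex being loaded but forbidden by Options -Indexes; with the module removed entirely, Apache has no handler left for a directory request without an index file, and on this setup that surfaces as the 404 you are seeing. A minimal sketch of the change (Debian commands; the vhost already carries the Options -Indexes line):

        # put the module back, keep listings disabled via Options -Indexes
        a2enmod autoindex
        service apache2 reload

    After the reload, a directory with no index.html/index.php should answer 403 ("directory index forbidden by Options directive") instead of 404.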

    Read the article

  • VirtualHost entries get overwritten when the Apache httpd.conf is rebuilt

    - by Amitabh
    Background: We have been trying to get a wildcard SSL working on multiple sub domains on a single dedicated address.. We have two sub domains next.my-personal-website.com and blog.my-personal-website.com Part of our strategy has been to edit the httpd.conf and add the NameVirtualHost xx.xx.144.72:443 directive and the virtualhost entries for port 443 for the subdomains there. This works good if we just edit the httpd.conf, add the entries, save it and restart the apache. The problem: But if we add a new sub domain from cpanel or we run the # /usr/local/cpanel/bin/apache_conf_distiller --update # /scripts/rebuildhttpdconf the virtualhost entries that we added manually are no more there in the newly generated httpd.conf file. Only the virtualhost entry for the main domain for port 443 that was there before we made edits to the httpd.conf is there(assuming we are not discussing virtualhost entries for port 80). I understand we need to put the new virtualhost entries in some include files as mentioned here in the cpanel documentation. But am not sure where to. So the question would be where do I put the NameVirtualHost xx.xx.144.72:443 directive and the two virtualhost directive for port 443, so that they are not overwritten when httpd.conf is rebuilt/regenerated later. Virtualhost entries: The two virtualhost entries for the subdomains are: <VirtualHost xx.xx.144.72:443> ServerName next.my-personal-website.com ServerAlias www.next.my-personal-website.com DocumentRoot /home/myguardi/public_html/next.my-personal-website.com ServerAdmin [email protected] UseCanonicalName On CustomLog /usr/local/apache/domlogs/next.my-personal-website.com combined CustomLog /usr/local/apache/domlogs/next.my-personal-website.com-bytes_log "%{%s}t %I .\n%{%s}t %O ." ## User myguardi # Needed for Cpanel::ApacheConf <IfModule mod_suphp.c> suPHP_UserGroup myguardi myguardi </IfModule> <IfModule !mod_disable_suexec.c> SuexecUserGroup myguardi myguardi </IfModule> ScriptAlias /cgi-bin/ /home/myguardi/public_html/next.my-personal-website.com/cgi-bin/ SSLEngine on SSLCertificateFile /etc/ssl/certs/my-personal-website.com.crt SSLCertificateKeyFile /etc/ssl/private/my-personal-website.com.key SSLCACertificateFile /etc/ssl/certs/my-personal-website.com.cabundle CustomLog /usr/local/apache/domlogs/next.my-personal-website.com-ssl_log combined SetEnvIf User-Agent ".*MSIE.*" nokeepalive ssl-unclean-shutdown <Directory "/home/myguardi/public_html/cgi-bin"> SSLOptions +StdEnvVars </Directory> and <VirtualHost xx.xx.144.72:443> ServerName blog.my-personal-website.com ServerAlias www.blog.my-personal-website.com DocumentRoot /home/myguardi/public_html/blog.my-personal-website.com ServerAdmin [email protected] UseCanonicalName On CustomLog /usr/local/apache/domlogs/blog.my-personal-website.com combined CustomLog /usr/local/apache/domlogs/blog.my-personal-website.com-bytes_log "%{%s}t %I .\n%{%s}t %O ." 
## User myguardi # Needed for Cpanel::ApacheConf <IfModule mod_suphp.c> suPHP_UserGroup myguardi myguardi </IfModule> <IfModule !mod_disable_suexec.c> SuexecUserGroup myguardi myguardi </IfModule> ScriptAlias /cgi-bin/ /home/myguardi/public_html/blog.my-personal-website.com/cgi-bin/ SSLEngine on SSLCertificateFile /etc/ssl/certs/my-personal-website.com.crt SSLCertificateKeyFile /etc/ssl/private/my-personal-website.com.key SSLCACertificateFile /etc/ssl/certs/my-personal-website.com.cabundle CustomLog /usr/local/apache/domlogs/blog.my-personal-website.com-ssl_log combined SetEnvIf User-Agent ".*MSIE.*" nokeepalive ssl-unclean-shutdown <Directory "/home/myguardi/public_html/cgi-bin"> SSLOptions +StdEnvVars </Directory> and the automatically generated virtualhost entry for the main domain for port 443 is <VirtualHost xx.xx.144.72:443> ServerName my-personal-website.com ServerAlias www.my-personal-website.com DocumentRoot /home/myguardi/public_html ServerAdmin [email protected] UseCanonicalName Off CustomLog /usr/local/apache/domlogs/my-personal-website.com combined CustomLog /usr/local/apache/domlogs/my-personal-website.com-bytes_log "%{%s}t %I .\n%{%s}t %O ." ## User myguardi # Needed for Cpanel::ApacheConf <IfModule mod_suphp.c> suPHP_UserGroup myguardi myguardi </IfModule> <IfModule !mod_disable_suexec.c> SuexecUserGroup myguardi myguardi </IfModule> ScriptAlias /cgi-bin/ /home/myguardi/public_html/cgi-bin/ SSLEngine on SSLCertificateFile /etc/ssl/certs/my-personal-website.com.crt SSLCertificateKeyFile /etc/ssl/private/my-personal-website.com.key SSLCACertificateFile /etc/ssl/certs/my-personal-website.com.cabundle CustomLog /usr/local/apache/domlogs/my-personal-website.com-ssl_log combined SetEnvIf User-Agent ".*MSIE.*" nokeepalive ssl-unclean-shutdown <Directory "/home/myguardi/public_html/cgi-bin"> SSLOptions +StdEnvVars </Directory> # To customize this VirtualHost use an include file at the following location # Include "/usr/local/apache/conf/userdata/ssl/2/myguardi/my-personal-website.com/*.conf" I really appreciate if somebody can tell me how to proceed on this. Thank you. Update: Include directives present are: `Include "/usr/local/apache/conf/includes/pre_main_global.conf" Include "/usr/local/apache/conf/includes/pre_main_2.conf" Include "/usr/local/apache/conf/php.conf" Include "/usr/local/apache/conf/includes/errordocument.conf" Include "/usr/local/apache/conf/modsec2.conf" Include "/usr/local/apache/conf/includes/pre_virtualhost_global.conf" Include "/usr/local/apache/conf/includes/pre_virtualhost_2.conf" ` These are the entries that are generated before any virtualhost entry is defined. Towards the end of the httpd.conf file , the following two entries are added Include "/usr/local/apache/conf/includes/post_virtualhost_global.conf" Include "/usr/local/apache/conf/includes/post_virtualhost_2.conf" The older httpd.conf file before we added the virtualhost entries for sub domains for port 443 can be viewed here
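    Both of the include hooks you need are already in the generated file: the Include lines listed in the update (pre_virtualhost_global.conf and friends) survive a rebuild, and so does anything under the userdata path cPanel mentions in the comment at the end of the main vhost. A rough sketch using the first option (file and script names come from the post itself, the rest is an assumption):

        # put the NameVirtualHost xx.xx.144.72:443 line and the two sub domain
        # <VirtualHost xx.xx.144.72:443> blocks here instead of editing httpd.conf:
        nano /usr/local/apache/conf/includes/pre_virtualhost_global.conf

        # then regenerate and restart; the custom blocks now come from the include
        /scripts/rebuildhttpdconf
        service httpd restart

    The userdata route (/usr/local/apache/conf/userdata/ssl/2/myguardi/my-personal-website.com/*.conf) works the same way but scopes the extra directives to that one virtual host.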

    Read the article

  • Performance difference between MacBook Pro (2.8 GHz) vs Air (1.7 GHz)?

    - by jonathanconway
    I'm comparing these two Apple laptops:

    MacBook Pro (13", 2011 model):
      - 2.8GHz dual-core Intel Core i7 processor with 4MB shared L3 cache
      - 4GB (two 2GB SO-DIMMs) of 1333MHz DDR3 SDRAM
      - AMD Radeon HD 6770M graphics processor with 1GB of GDDR5 memory on 2.4GHz configuration

    MacBook Air (13", 2011 model):
      - 1.7GHz dual-core Intel Core i5 with 3MB shared L3 cache
      - 4GB of 1333MHz DDR3 onboard memory
      - Intel HD Graphics 3000 processor with 384MB of DDR3 SDRAM shared with main memory

    There's definitely a gap between them in terms of CPU speed and graphics, but what practical difference would this make on a day-to-day basis? On the one hand, I love the sleek, thin appearance of the Air. On the other hand, I don't want a machine that's going to be dog-slow when doing tasks such as running Virtual Machines, dual-booting to Windows and running multiple instances of Visual Studio, and maybe some light gaming. Is there going to be a major difference that makes the MacBook Pro a more attractive purchase?

    Read the article
