Search Results

Search found 2015 results on 81 pages for 'paul lefebvre'.

Page 9/81 | < Previous Page | 5 6 7 8 9 10 11 12 13 14 15 16  | Next Page >

  • Oracle Buys BigMachines - Adds Leading Configure, Price and Quote (CPQ) Cloud to the Oracle Cloud to Enable Smarter Selling

    - by Richard Lefebvre
    News Facts
    Oracle today announced that it has entered into an agreement to acquire BigMachines, a leading cloud-based Configure, Price and Quote (CPQ) solution provider. BigMachines’ CPQ Cloud accelerates the conversion of sales opportunities into revenue by automating the sales order process with guided selling, dynamic pricing, and an easy-to-use workflow approval process, accessible anywhere, on any device. Companies that use sales automation technology often rely on manual, cumbersome and disconnected processes to convert opportunities into orders. This creates errors, adds costs, delays revenue, and degrades the customer experience. BigMachines’ CPQ Cloud extends sales automation to include the creation of an optimal quote, which enables sales personnel to easily configure and price complex products, select the best options, promotions and deal terms, and include upsells and renewals, all using automated workflows. In combination with Oracle’s enterprise-grade cloud solutions, including Marketing, Sales, Social, Commerce and Service Clouds, Oracle and BigMachines will create an end-to-end smarter selling cloud solution so sales personnel are more productive, customers are more satisfied, and companies grow revenue faster. More information on this announcement can be found at http://www.oracle.com/bigmachines
    Supporting Quotes
    “The fundamental goals of smarter selling are to provide sales teams with the information, access, and insights they need to maximize revenue opportunities and execute on all phases of the sales cycle,” said Thomas Kurian, Executive Vice President, Oracle Development. “By adding BigMachines’ CPQ Cloud to the Oracle Cloud, companies will be able to drive more revenue and increase customer satisfaction with a seamlessly integrated process across marketing and sales, pricing and quoting, and fulfillment and service.”
    “BigMachines has developed leading CPQ solutions that serve companies of all sizes across multiple industries,” said David Bonnette, BigMachines’ CEO. “Together with Oracle, we expect to provide a complete cloud solution to manage sales processes and deliver exceptional customer experiences.”
    Supporting Resources: About Oracle and BigMachines, General Presentation, Customer and Partner Letter, FAQ

    Read the article

  • Cognizant: committed to Oracle Fusion Applications and Oracle Cloud

    - by Richard Lefebvre
    Cognizant is a Global System Integrator strongly committed to Oracle Fusion Applications and Oracle Cloud, offering fixed-scope implementations. In this short video, you can learn more about Cognizant's strategy, experience and offerings. Cognizant is a Platinum Partner specialized in several Oracle Fusion Cloud Service areas.

    Read the article

  • The Oracle Architects Training: 40 training sessions for our EMEA partners to build their Oracle Applications and Technical skills

    - by Richard Lefebvre
    There is a lot more to Oracle technology than meets the eye. Sure, you already belong to a small circle of our most experienced and committed partners. But are you making the best use possible of our technology solutions? Put it to the test. Join the “Oracle Partner Architects Training”. It is aimed at providing your experts, architects and consultants with in-depth architectural knowledge about Oracle technology. Here is your chance to learn from the best. Seasoned speakers, exclusive content and no product marketing. Oracle technology beyond the obvious. Mark your calendar: the Oracle Partner Architects Training is an online training program. Sign up for the live WebEx sessions (scheduled from January to April 2013) or watch replays as they become available. Feel free to follow training sessions at your own pace. Also, last year’s sessions are still highly relevant and remain available on architects.oraevents.eu NOTE: Looking to get your consultants Oracle certified? One more reason to join the Oracle Partner Architects Training. It is the fast track to getting their expertise validated with an Oracle certificate.

    Read the article

  • Oracle Partner Specialists – Sell & Deliver High Value Products to Customers

    - by Richard Lefebvre
    Do you want to know where to find useful information about partner training and other activities to complete an Oracle Specialization in the country where you are based? Go to the EMEA partner enablement blog and read the latest information regarding training opportunities you can join for Cloud Services, Applications, Business Intelligence, Middleware, Database 12c, Engineered Systems, as well as Server & Storage. Recently, we announced new TestFest events in France, which you can join to pass your own Implementation Assessment within the Specialization category you have already chosen. To find out where and when the next TestFest close to your location will take place, please contact [email protected] or watch out for further announcements of TestFest events in your home country. Return to the EMEA Partner Enablement blog from time to time to update your own Specialization and join the latest training for Sales, Presales or Implementation Specialists: https://blogs.oracle.com/opnenablement/

    Read the article

  • Creating Ideal Customers with Modern Marketing

    - by Richard Lefebvre
    “Without that real-time perspective, it's just not possible to stay in step with what your customers want and need.” — Customer-Obsessed Marketing Is Your Next Competitive Edge. Every business talks about focusing on the customer. But few actually deliver. Why? Because digital marketing technology can’t tell a compelling story. It lacks engaging dialogue and offers no connection beyond the transaction. It’s lost in translation because marketers don’t speak code. And it’s confusing to the customer because marketing and IT can’t connect process and data. Take a look at your digital marketing picture. From a distance it may look fine. But look up close: it’s fragmented and the dots are not connected. You need much higher resolution. Step back and see the big picture. Zoom in on the individual customer. But you’ll need Modern Marketing technology engineered with enterprise-grade data management and proven cloud performance. Explore the people, processes, and technology of the Oracle Marketing Cloud. Create a culture of customer obsession. Simplify marketing across all channels to turn casual prospects into passionate advocates. Engage ideal customers with a meaningful experience. Personalize your brand narrative for each customer in every chapter of your story to increase engagement and revenue. Read the full article and watch the videos here.

    Read the article

  • Customer Concepts Magazine issue 7

    - by Richard Lefebvre
    Why should you integrate social into your key sales, marketing, commerce and service processes for a great customer experience? Find out how to get social success from the latest edition of Oracle Customer Concepts Magazine here.

    Read the article

  • New in Production: Fusion CRM Implementation Specialist Exams!

    - by Richard Lefebvre
    The Oracle PartnerNetwork Specialized program is releasing new certifications on our latest products, and partners are invited to be the first candidates. The Oracle Fusion Customer Relationship Management 11g Sales Essentials Exam (1Z0-456) is now in production! All Beta exam participants will receive their exam scores as of September 24, 2012, and successful candidates will receive their certificates starting mid-October 2012. Contact us: please direct any inquiries you may have to the Oracle Partner Enablement team at [email protected].

    Read the article

  • Oracle CLOUDWORLD Global Event Series: Registration now open

    - by Richard Lefebvre
    Showcasing our ongoing commitment to cloud, Oracle will present an all-new global event series, Oracle CloudWorld! Our senior executives, customers, partners, and industry thought leaders will share how to drive business transformation using the Oracle Cloud, highlighting Oracle's PaaS, SaaS, and Social solutions. Dubai, United Arab Emirates (January 15): registration now open. Munich, Germany (April): registration opening soon. London, England (April): registration opening soon. See the dedicated Oracle CloudWorld website for information and registration details, and read the Oracle press release.

    Read the article

  • Oracle RightNow Cloud Service Roadmap - Live Webcast, Nov 13, 6pm CET

    - by Richard Lefebvre
    Did you miss out on Oracle OpenWorld this year? Then make sure you don’t miss out on this webinar. The Oracle RightNow development team shares the latest innovations and integrations, including the future roadmap, for the Oracle RightNow Service Cloud platform. Find out how these innovations will help you deliver exceptional customer service to your customers. Join our live webcast on Wednesday, November 13, 2013, 9:00 a.m. PT / 12:00 p.m. ET (18:00 CET) to get up to speed with the latest updates and future capabilities, and learn how you can: provide a more engaging Web experience; increase the effectiveness of assisted service; deliver the right answer at the right time - all the time. Agenda topics: how RightNow fits into Oracle's vision for Oracle Service; the latest updates to the Oracle RightNow Cloud Service platform; Oracle's key investment areas and roadmap for Oracle RightNow Cloud Service. Don’t miss this chance to learn how you can delight your customers while improving cost and efficiency. Register now.

    Read the article

  • Oracle Voice, the Virtual Assistant for Sales Reps

    - by Richard Lefebvre
    Wish there was a Siri-like virtual assistant for sales reps? The Oracle Voice for Sales Cloud application is now available in the iTunes Store. Selling from your iPhone has never been this fast, friendly & fun! See Oracle Voice for Sales Cloud in action.

    Read the article

  • Steltix (NL) is live on Oracle Sales Cloud

    - by Richard Lefebvre
    Steltix (NL) uses Oracle Sales Cloud (Oracle Fusion CRM in the Oracle Cloud) to improve the business performance of customers and to reduce costs and minimize risks. If you read Dutch, I encourage you to read the press release here!

    Read the article

  • Securing SMTP with login

    - by Paul Peelen
    I have an ISPConfig server, and it seems that someone is using it to send spam. I have received about 130 "Mail Delivery System" emails about rejected messages. The spammer uses my email address as the sender address, so all of these bounces end up in my mailbox. I am using Postfix and Courier. I installed my server according to this guide: http://www.howtoforge.com/perfect-server-debian-lenny-ispconfig3-p3 I did this a few months ago. My question: can I secure my server to require login to be able to send email, and if so... how? Thanks!
    EDIT: Some data from mail.log; these kinds of errors show up constantly:
    Jun 15 17:58:16 bolt postfix/qmgr[10712]: CC7DA1242AE: from=<paul@*****.se>, size=3782, nrcpt=1 (queue active)
    Jun 15 17:58:16 bolt postfix/smtp[11337]: CC7DA1242AE: to=<[email protected]>, relay=none, delay=4641, delays=4640/0.01/0.32/0, dsn=4.4.3, status=deferred (Host or domain name not found. Name service error for name=cmlisboa.pt type=MX: Host not found, try again)
    Jun 15 17:58:19 bolt postfix/smtpd[10836]: connect from static-200-105-220-154.acelerate.net[200.105.220.154]
    Jun 15 17:58:20 bolt postfix/smtpd[10836]: NOQUEUE: reject: RCPT from static-200-105-220-154.acelerate.net[200.105.220.154]: 550 5.1.1 <advertising@*****.com>: Recipient address rejected: User unknown in virtual mailbox table; from=<[email protected]> to=<advertising@*****.com> proto=ESMTP helo=<static-200-105-220-154.acelerate.net>
    Jun 15 17:58:20 bolt postfix/smtpd[10836]: lost connection after DATA (0 bytes) from static-200-105-220-154.acelerate.net[200.105.220.154]
    Jun 15 17:58:20 bolt postfix/smtpd[10836]: disconnect from static-200-105-220-154.acelerate.net[200.105.220.154]
    Jun 15 17:58:29 bolt postfix/smtpd[10834]: connect from unknown[62.176.172.226]
    Jun 15 17:58:32 bolt postfix/smtpd[10834]: 386791241F9: client=unknown[62.176.172.226]
    Jun 15 17:58:34 bolt postfix/cleanup[10975]: 386791241F9: message-id=<[email protected]>
    Jun 15 17:58:34 bolt postfix/qmgr[10712]: 386791241F9: from=<[email protected]>, size=867, nrcpt=1 (queue active)
    Jun 15 17:58:35 bolt postfix/smtpd[10834]: disconnect from unknown[62.176.172.226]
    Jun 15 17:58:35 bolt amavis[11084]: (11084-17) Blocked SPAM, [62.176.172.226] [62.176.172.226] <[email protected]> -> <*****@*****>, Message-ID: <[email protected]>, mail_id: XczovKoMBYNr, Hits: 18.471, size: 867, 833 ms
    Jun 15 17:58:35 bolt postfix/smtp[10732]: 386791241F9: to=<*****@*****>, relay=127.0.0.1[127.0.0.1]:10024, delay=3.5, delays=2.7/0/0/0.83, dsn=2.7.0, status=sent (250 2.7.0 Ok, discarded, id=11084-17 - SPAM)
    Jun 15 17:58:35 bolt postfix/qmgr[10712]: 386791241F9: removed
    Jun 15 17:58:43 bolt postfix/smtpd[10836]: warning: 178.121.154.194: address not listed for hostname mm-194-154-121-178.dynamic.pppoe.mgts.by
    Jun 15 17:58:43 bolt postfix/smtpd[10836]: connect from unknown[178.121.154.194]
    Jun 15 17:58:45 bolt postfix/smtpd[10727]: connect from unknown[180.134.223.86]
    EDIT #2: Got some more info from the logs; this is a send request:
    mail.info.1:Jun 15 16:41:57 bolt amavis[5399]: (05399-06) Passed CLEAN, [110.139.48.64] [110.139.48.64] <paul@*****.se> -> <[email protected]>, Message-ID: <CHILKAT-MID-7c54ebcf-5501-de9b-f0b1-4f0234290d8d@HP-IRISH>, mail_id: 35l56Ramx6Nc, Hits: -2.941, size: 3329, queued_as: 2485770086, 136 ms
    mail.info.1:Jun 15 16:41:57 bolt postfix/smtp[4743]: 375C570082: to=<[email protected]>, relay=127.0.0.1[127.0.0.1]:10024, delay=4.8, delays=4.7/0/0/0.14, dsn=2.0.0, status=sent (250 2.0.0 Ok, id=05399-06, from MTA([127.0.0.1]:10025): 250 2.0.0 Ok: queued as 2485770086)
    Which apparently got through. Any ideas on how to restrict this?
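    For the "require login to send" part of the question, a minimal sketch of the Postfix main.cf settings that are usually involved (assuming SASL authentication is already provided by Courier's authdaemon or a similar backend, as in the ISPConfig guide above; the parameter names are standard Postfix settings, but the values shown are illustrative, not taken from this server):
        # Offer SMTP AUTH and refuse anonymous mechanisms
        smtpd_sasl_auth_enable = yes
        smtpd_sasl_security_options = noanonymous
        broken_sasl_auth_clients = yes
        # Relay only for local networks and authenticated clients;
        # everyone else may deliver to local domains only
        smtpd_recipient_restrictions =
            permit_mynetworks,
            permit_sasl_authenticated,
            reject_unauth_destination
    With reject_unauth_destination in place, unauthenticated clients cannot relay mail outward; after editing main.cf, run postfix reload and check the effective settings with postconf -n. Note that this only closes an open relay - if the spam is being injected by an authenticated account or a local web script, the fix lies elsewhere.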

    Read the article

  • Cardinality Estimation Bug with Lookups in SQL Server 2008 onward

    - by Paul White
    Cost-based optimization stands or falls on the quality of cardinality estimates (expected row counts). If the optimizer has incorrect information to start with, it is quite unlikely to produce good quality execution plans except by chance. There are many ways we can provide good starting information to the optimizer, and even more ways for cardinality estimation to go wrong. Good database people know this, and work hard to write optimizer-friendly queries with a schema and metadata (e.g. statistics) that reduce the chances of poor cardinality estimation producing a sub-optimal plan. Today, I am going to look at a case where poor cardinality estimation is Microsoft’s fault, and not yours.
    SQL Server 2005
    SELECT th.ProductID, th.TransactionID, th.TransactionDate FROM Production.TransactionHistory AS th WHERE th.ProductID = 1 AND th.TransactionDate BETWEEN '20030901' AND '20031231';
    The query plan on SQL Server 2005 is as follows (if you are using a more recent version of AdventureWorks, you will need to change the year on the date range from 2003 to 2007): There is an Index Seek on ProductID = 1, followed by a Key Lookup to find the Transaction Date for each row, and finally a Filter to restrict the results to only those rows where Transaction Date falls in the range specified. The cardinality estimate of 45 rows at the Index Seek is exactly correct. The table is not very large, there are up-to-date statistics associated with the index, so this is as expected. The estimate for the Key Lookup is also exactly right. Each lookup into the Clustered Index to find the Transaction Date is guaranteed to return exactly one row. The plan shows that the Key Lookup is expected to be executed 45 times. The estimate for the Inner Join output is also correct – 45 rows from the seek joining to one row each time, gives 45 rows as output. The Filter estimate is also very good: the optimizer estimates 16.9951 rows will match the specified range of transaction dates. Eleven rows are produced by this query, but that small difference is quite normal and certainly nothing to worry about here. All good so far.
    SQL Server 2008 onward
    The same query executed against an identical copy of AdventureWorks on SQL Server 2008 produces a different execution plan: The optimizer has pushed the Filter conditions seen in the 2005 plan down to the Key Lookup. This is a good optimization – it makes sense to filter rows out as early as possible. Unfortunately, it has made a bit of a mess of the cardinality estimates. The post-Filter estimate of 16.9951 rows seen in the 2005 plan has moved with the predicate on Transaction Date. Instead of estimating one row, the plan now suggests that 16.9951 rows will be produced by each clustered index lookup – clearly not right! This misinformation also confuses SQL Sentry Plan Explorer: Plan Explorer shows 765 rows expected from the Key Lookup (it multiplies a rounded estimate of 17 rows by 45 expected executions to give 765 rows total).
    Workarounds
    One workaround is to provide a covering non-clustered index (avoiding the lookup avoids the problem of course): CREATE INDEX nc1 ON Production.TransactionHistory (ProductID) INCLUDE (TransactionDate); With the Transaction Date filter applied as a residual predicate in the same operator as the seek, the estimate is again as expected: We could also force the use of the ultimate covering index (the clustered one): SELECT th.ProductID, th.TransactionID, th.TransactionDate FROM Production.TransactionHistory AS th WITH (INDEX(1)) WHERE th.ProductID = 1 AND th.TransactionDate BETWEEN '20030901' AND '20031231';
    Summary
    Providing a covering non-clustered index for all possible queries is not always practical, and scanning the clustered index will rarely be optimal. Nevertheless, these are the best workarounds we have today. In the meantime, watch out for poor cardinality estimates when a predicate is applied as part of a lookup. The worst thing is that the estimate after the lookup join in the 2008+ plans is wrong. It’s not hopelessly wrong in this particular case (45 versus 16.9951 is not the end of the world) but it easily can be much worse, and there’s not much you can do about it. Any decisions made by the optimizer after such a lookup could be based on very wrong information – which can only be bad news. If you think this situation should be improved, please vote for this Connect item.
    © 2012 Paul White – All Rights Reserved twitter: @SQL_Kiwi email: [email protected]
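    As a quick way to see the estimated and actual row counts discussed above side by side without opening the graphical plan, one option (a sketch added here for convenience, not part of the original article) is to run the query with the textual showplan output enabled:
        -- Return each plan operator as a result row, with the actual row
        -- count (Rows) next to the optimizer's estimate (EstimateRows)
        SET STATISTICS PROFILE ON;

        SELECT th.ProductID, th.TransactionID, th.TransactionDate
        FROM Production.TransactionHistory AS th
        WHERE th.ProductID = 1
        AND th.TransactionDate BETWEEN '20030901' AND '20031231';

        SET STATISTICS PROFILE OFF;
    Comparing the Rows and EstimateRows columns for the Key Lookup operator makes the inflated estimate on SQL Server 2008 and later easy to spot.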

    Read the article

  • BIOS flash XP, 1 long beep, 2 short beeps, over & over

    - by Paul
    BIOS issue on an HP dv9233cl laptop: wiped the drive of Vista, loaded XP, but not all the drivers loaded. Went to the HP website and downloaded all drivers for this laptop. Started loading them. Loaded WIN Flash HP Network System BIOS Window SP42187. After a minute a low-resolution screen appeared stating "It is now safe to turn off the computer". I waited a minute and a half, then turned it off. Let it sit 10 seconds, tried to start it, and now there is no screen image at all, just a nasty loud long beep, 2 short beeps, 2 seconds of silence, and it happens over and over again. I have unplugged it and removed the battery, still the same problem. Any suggestions? Thx.. Paul

    Read the article

  • LTO 3 tape drive needing repair

    - by DO it all Paul
    We have an IBM LTO 3 tape drive that needs to be repaired, and with the £400 price tag I'm having to shop around for quotes. My question is: has anyone actually repaired one before, and how was it done? The first error LED was showing a 6; then I cleared the mangled tape, only for it to start flashing an alternating 'o' on the 7-segment display, similar to a half 8, flashing top to bottom, and it would just flash away like that coupled with a flashing amber light. I tried a reset by holding the eject button; it showed an 'r', then went back to flashing again as before. I checked the IBM solutions for the codes but this flashing isn't documented at all. Would be great if anyone had any experience in this area. Thank you, Paul

    Read the article

  • HTTP downloads slow - FTP of same file very fast - Windows 2003

    - by Paul Hinett
    I am having some issues with download speeds on my site via HTTP: I am averaging around 70 kbps downloading a file that is around 70 MB. But if I connect to my server via FTP and download the same file on the same computer / connection, I am averaging about 300+ kbps. I know my server has a lot of connections at any one time, probably around 400 connections. My server has a 1 Gbps connection to the internet, so there is plenty of bandwidth available, as proven with the FTP. I have no throttling of any kind enabled in IIS. If interested, there is a test file here you can download to check the speed: http://filesd.house-mixes.com/test.zip I am based in the UK and the server is in Washington, USA, if that makes any difference. Paul

    Read the article

  • Setting up routing for MS DirectAccess to a VMware ESXi host

    - by Paul D'Ambra
    I'm trying to set up DirectAccess on a virtual machine so I can demonstrate its value and then, if need be, add a physical machine to host it. I'm hitting a problem because the DirectAccess machine (DA01) needs to have 2 public addresses actually configured on the external adapter, but there is a Zyxel Zywall USG300 between the VMware ESXi host and the outside world. I've summarised my setup in this diagram. If I ping 212.x.y.89 from the LAN I get a response, but if I ping from the VM I get destination host unreachable. I used "route add 212.x.y.89 192.c.d.1" and get request timed out. At that point I see outbound traffic allowed on the Zyxel firewall but nothing coming back. I'm past my understanding of routing and VMware, so I am not sure how to tie down where my problem lies (or even if this setup is possible). So any help massively appreciated. Paul

    Read the article

  • KVM Guest not reachable from host

    - by Paul
    Hello, I'm running Ubuntu Server 9.10 and have installed KVM etc. I created the bridge network following the instructions at help.ubuntu.com/community/KVM/Networking, then created a Windows 2008 guest using the virt-install command line (using the virt-manager GUI from a remote Ubuntu desktop would not let me select the ISO location). I can however use a remote virt-manager to connect to the guest and complete the Windows install. Within Windows 2008 I changed the IP address, but I cannot ping it from the outside world. The bridge network appears fine - I'm not sure what else to look at! Here is the interfaces file:
    # The loopback network interface
    auto lo
    iface lo inet loopback
    # The primary network interface
    auto eth0
    iface eth0 inet manual
    #
    auto br0
    iface br0 inet static
        address 60.234.64.50
        netmask 255.255.255.248
        network 60.234.0.0
        broadcast 60.234.0.255
        gateway 60.234.64.49
        bridge_ports eth0
        bridge_stp off
        bridge_fd 0
        bridge_maxwait 0
    auto eth1
    iface eth1 inet static
        address 192.168.12.2
        netmask 255.255.255.0
        broadcast 192.168.12.255
    The IP of the Windows server is 60.234.64.52. What else should I check? Regards, Paul.
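    A few host-side checks that are commonly suggested for this kind of bridged-KVM problem (a sketch using standard bridge-utils / iproute2 commands, not specific to this configuration):
        # Confirm eth0 is enslaved to br0 and the bridge is up
        brctl show
        ip addr show br0
        # eth0 itself should carry no IP address; only br0 should hold 60.234.64.50
        ip addr show eth0
        # With the guest running, its MAC address should appear on the bridge
        brctl showmacs br0
    If the guest's MAC never shows up on br0, the VM is probably not attached to the bridge (check the network interface definition created by virt-install); if it does show up, the next things to look at are the guest's default gateway and the Windows firewall.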

    Read the article

  • Silverlight Version 4 latest build for Win7 64bit and WinXP 32bit

    - by Paul
    I have a requirement where a few people need the latest version of Silverlight 4 installed. I know the latest version is 5.xx... but apparently with some new software we're having installed we have to use version 4. After a bit of googling I can see that the latest version 4 build is 4.1.10329.0, released May 8, 2012. We have a mix of Win7 64-bit machines and WinXP 32-bit machines. Q: Is there a different version for each OS, or does the same one fit all? (This seems strangely hard to decipher by googling.) Q: Does anyone know where I can download the latest version 4? Microsoft do not seem to offer it anymore, unless I'm just not finding it. Q: Is there a separate browser version of it, or will installing it also handle any browser needs (our new software will be browser-based)? Any pointers much appreciated. Paul

    Read the article

  • Connecting my iPhone to iTunes causes my Acer laptop to crash

    - by Paul Sheldrake
    Hello, I have an Acer TravelMate 8200 laptop and whenever I connect my iPhone to it, it crashes with a BSOD (Blue Screen of Death). I have figured out that if I delete all the pictures on my phone I can get it to connect, but that is not an ideal long-term solution. I also read that it may be a conflict with the built-in webcam I have, but I've upgraded the driver and I still get the crashing problem. Any suggestions would be appreciated! Thanks, Paul! Edit: Here is the BSOD message I get

    Read the article

  • No HDMI audio - Windows 8 - ASUS H81M-PLUS

    - by Paul Wright
    I have an issue with HDMI audio on Windows 8 using an ASUS H81M-PLUS motherboard (without an external graphics card). There are many forum posts advising you to go into playback devices and set HDMI as the default - I have done this. To clarify what works and what doesn't: I have not been able to get sound from my HDTV using HDMI. I have used this HDMI cable with my PS3, so the cable should be fine. I am able to use the HDMI cable in extended mode, so that I have two monitors (including the TV), just no audio. The HDMI cable goes straight from the motherboard to the TV. Below I have included screenshots of Device Manager and of Playback Devices (showing disabled and disconnected devices). I am at a loss. I have uninstalled all drivers, rebooted and made Windows look for the correct ones, and made sure the HDMI device was the default. Thanks, Paul

    Read the article

  • I accidentally deleted the recovery folder on a partition (Win Vista Home)

    - by paul
    I accidentally deleted the recovery folder on the recovery partition (Win Vista Home). I think it was some sort of scheduled maintenance from a program that I did not configure properly? Oops... lol. I called Toshiba and they said I needed to buy a recovery program, which I didn't bother doing. I bought a legal copy of Vista and would like to install the correct files in such a way that when my computer starts looking for recovery files it will eventually find them, or I can point it to the partition. I'm pretty sure it's not just a matter of copy and paste (is it?). Thanks, Paul

    Read the article

  • Creating different margins on the first page of a Word template

    - by Paul
    I have a letterhead template and I need the left margin on the first page to be larger than on subsequent pages. I've seen the option of placing a text box or image box in the header to push the text over, but this ends up throwing off the tabs and bullet-list indentation markers. I also thought of setting up the first page using two columns and pushing the text to start in the second column, but I can't seem to find a way to get the text to switch back to one column on the second page when that page is created by overflowing text. Does anyone know how something like this is possible? Thanks in advance, Paul

    Read the article

  • Disable OS X Portable Home Directories for specific hosts for all users, not just individuals?

    - by Paul Nendick
    Would it be possible to block any and all Portable Home Directory services for specific hosts? Something like MCX's "MobileAccountNeverAsk-" but for the whole workstation? We have a network with both portable and stationary machines. I'd like our users to be able to use all machines, going portable on the MacBook but not being bothered with syncing when logged into stationary iMacs or Mac Pros. The Open Directory servers are running Snow Leopard (for now) and all clients are running Lion. Thanks! Paul

    Read the article

  • Fun with Aggregates

    - by Paul White
    There are interesting things to be learned from even the simplest queries. For example, imagine you are given the task of writing a query to list AdventureWorks product names where the product has at least one entry in the transaction history table, but fewer than ten. One possible query to meet that specification is:
    SELECT p.Name FROM Production.Product AS p JOIN Production.TransactionHistory AS th ON p.ProductID = th.ProductID GROUP BY p.ProductID, p.Name HAVING COUNT_BIG(*) < 10;
    That query correctly returns 23 rows (execution plan and data sample shown below): The execution plan looks a bit different from the written form of the query: the base tables are accessed in reverse order, and the aggregation is performed before the join. The general idea is to read all rows from the history table, compute the count of rows grouped by ProductID, merge join the results to the Product table on ProductID, and finally filter to only return rows where the count is less than ten.
    This ‘fully-optimized’ plan has an estimated cost of around 0.33 units. The reason for the quote marks there is that this plan is not quite as optimal as it could be – surely it would make sense to push the Filter down past the join too? To answer that, let’s look at some other ways to formulate this query. This being SQL, there are any number of ways to write logically-equivalent query specifications, so we’ll just look at a couple of interesting ones. The first query is an attempt to reverse-engineer T-SQL from the optimized query plan shown above. It joins the result of pre-aggregating the history table to the Product table before filtering:
    SELECT p.Name FROM ( SELECT th.ProductID, cnt = COUNT_BIG(*) FROM Production.TransactionHistory AS th GROUP BY th.ProductID ) AS q1 JOIN Production.Product AS p ON p.ProductID = q1.ProductID WHERE q1.cnt < 10;
    Perhaps a little surprisingly, we get a slightly different execution plan: The results are the same (23 rows) but this time the Filter is pushed below the join! The optimizer chooses nested loops for the join, because the cardinality estimate for rows passing the Filter is a bit low (estimate 1 versus 23 actual), though you can force a merge join with a hint and the Filter still appears below the join. In yet another variation, the < 10 predicate can be ‘manually pushed’ by specifying it in a HAVING clause in the “q1” sub-query instead of in the WHERE clause as written above. The reason this predicate can be pushed past the join in this query form, but not in the original formulation, is simply an optimizer limitation – it does make efforts (primarily during the simplification phase) to encourage logically-equivalent query specifications to produce the same execution plan, but the implementation is not completely comprehensive.
    Moving on to a second example, the following query specification results from phrasing the requirement as “list the products where there exists fewer than ten correlated rows in the history table”:
    SELECT p.Name FROM Production.Product AS p WHERE EXISTS ( SELECT * FROM Production.TransactionHistory AS th WHERE th.ProductID = p.ProductID HAVING COUNT_BIG(*) < 10 );
    Unfortunately, this query produces an incorrect result (86 rows): The problem is that it lists products with no history rows, though the reasons are interesting. The COUNT_BIG(*) in the EXISTS clause is a scalar aggregate (meaning there is no GROUP BY clause) and scalar aggregates always produce a value, even when the input is an empty set. In the case of the COUNT aggregate, the result of aggregating the empty set is zero (the other standard aggregates produce a NULL). To make the point really clear, let’s look at product 709, which happens to be one for which no history rows exist:
    -- Scalar aggregate
    SELECT COUNT_BIG(*) FROM Production.TransactionHistory AS th WHERE th.ProductID = 709;
    -- Vector aggregate
    SELECT COUNT_BIG(*) FROM Production.TransactionHistory AS th WHERE th.ProductID = 709 GROUP BY th.ProductID;
    The estimated execution plans for these two statements are almost identical: You might expect the Stream Aggregate to have a Group By for the second statement, but this is not the case. The query includes an equality comparison to a constant value (709), so all qualified rows are guaranteed to have the same value for ProductID and the Group By is optimized away. In fact there are some minor differences between the two plans (the first is auto-parameterized and qualifies for trivial plan, whereas the second is not auto-parameterized and requires cost-based optimization), but there is nothing to indicate that one is a scalar aggregate and the other is a vector aggregate. This is something I would like to see exposed in show plan so I suggested it on Connect. Anyway, the results of running the two queries show the difference at runtime: The scalar aggregate (no GROUP BY) returns a result of zero, whereas the vector aggregate (with a GROUP BY clause) returns nothing at all.
    Returning to our EXISTS query, we could ‘fix’ it by changing the HAVING clause to reject rows where the scalar aggregate returns zero:
    SELECT p.Name FROM Production.Product AS p WHERE EXISTS ( SELECT * FROM Production.TransactionHistory AS th WHERE th.ProductID = p.ProductID HAVING COUNT_BIG(*) BETWEEN 1 AND 9 );
    The query now returns the correct 23 rows: Unfortunately, the execution plan is less efficient now – it has an estimated cost of 0.78 compared to 0.33 for the earlier plans. Let’s try adding a redundant GROUP BY instead of changing the HAVING clause:
    SELECT p.Name FROM Production.Product AS p WHERE EXISTS ( SELECT * FROM Production.TransactionHistory AS th WHERE th.ProductID = p.ProductID GROUP BY th.ProductID HAVING COUNT_BIG(*) < 10 );
    Not only do we now get correct results (23 rows), this is the execution plan: I like to compare that plan to quantum physics: if you don’t find it shocking, you haven’t understood it properly :) The simple addition of a redundant GROUP BY has resulted in the EXISTS form of the query being transformed into exactly the same optimal plan we found earlier. What’s more, in SQL Server 2008 and later, we can replace the odd-looking GROUP BY with an explicit GROUP BY on the empty set:
    SELECT p.Name FROM Production.Product AS p WHERE EXISTS ( SELECT * FROM Production.TransactionHistory AS th WHERE th.ProductID = p.ProductID GROUP BY () HAVING COUNT_BIG(*) < 10 );
    I offer that as an alternative because some people find it more intuitive (and it perhaps has more geek value too). Whichever way you prefer, it’s rather satisfying to note that the result of the sub-query does not exist for a particular correlated value where a vector aggregate is used (the scalar COUNT aggregate always returns a value, even if zero, so it always ‘EXISTS’ regardless of which ProductID is logically being evaluated).
    The following query forms also produce the optimal plan and correct results, so long as a vector aggregate is used (you can probably find more equivalent query forms):
    WHERE Clause
    SELECT p.Name FROM Production.Product AS p WHERE ( SELECT COUNT_BIG(*) FROM Production.TransactionHistory AS th WHERE th.ProductID = p.ProductID GROUP BY () ) < 10;
    APPLY
    SELECT p.Name FROM Production.Product AS p CROSS APPLY ( SELECT NULL FROM Production.TransactionHistory AS th WHERE th.ProductID = p.ProductID GROUP BY () HAVING COUNT_BIG(*) < 10 ) AS ca (dummy);
    FROM Clause
    SELECT q1.Name FROM ( SELECT p.Name, cnt = ( SELECT COUNT_BIG(*) FROM Production.TransactionHistory AS th WHERE th.ProductID = p.ProductID GROUP BY () ) FROM Production.Product AS p ) AS q1 WHERE q1.cnt < 10;
    This last example uses SUM(1) instead of COUNT and does not require a vector aggregate…you should be able to work out why :)
    SELECT q.Name FROM ( SELECT p.Name, cnt = ( SELECT SUM(1) FROM Production.TransactionHistory AS th WHERE th.ProductID = p.ProductID ) FROM Production.Product AS p ) AS q WHERE q.cnt < 10;
    The semantics of SQL aggregates are rather odd in places. It definitely pays to get to know the rules, and to be careful to check whether your queries are using scalar or vector aggregates. As we have seen, query plans do not show in which ‘mode’ an aggregate is running and getting it wrong can cause poor performance, wrong results, or both.
    © 2012 Paul White Twitter: @SQL_Kiwi email: [email protected]
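    For readers who want to check the scalar-aggregate behaviour directly, a quick experiment in the spirit of the product 709 example above (added here as a sketch, not part of the original article) is to run a scalar SUM against a product with no history rows:
        -- Product 709 has no rows in TransactionHistory.
        -- A scalar SUM over the empty set returns NULL (not zero),
        -- so a "< 10" comparison against it evaluates to UNKNOWN
        -- and such rows are filtered out.
        SELECT SUM(1)
        FROM Production.TransactionHistory AS th
        WHERE th.ProductID = 709;
    That NULL result is why the final SUM(1) query form still returns the correct 23 rows even though it uses a scalar aggregate.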

    Read the article
