Search Results

Search found 30724 results on 1229 pages for 'backup solution'.

  • "Siebel2FusionCRM Integration" solution by ec4u (D)

    - by Richard Lefebvre
    ec4u, a CRM system integration leader based in Germany and Switzerland and a long-standing Oracle/Siebel partner, offers a complete "Siebel2FusionCRM Integration" solution based on tools, methodology, and services. The solution's main objectives are:

    - Integrate Siebel (on-premise) with Fusion CRM / Marketing ("in the cloud")
    - Accounts, contacts, and addresses are maintained by Sales in Siebel CRM and synchronized in real time into the Fusion CRM / Marketing CDM
    - Processing ensures clean data for marketing campaigns (validation and deduplication)
    - Create e-mail marketing campaigns and newsletters in Fusion

    The solution features:

    - Upsert processes that figure out what information needs to be updated, inserted, or terminated (deleted). As Siebel is the data master, it is still a one-way synchronization.
    - Handling of deleted or nullified information by terminating it in Fusion CRM (start and end dates define the validity period)
    - The same processes for both the initial load and real-time synchronization
    - Repeatable invocations/operations, since the Fusion web services offer no transactional support
    - Tagging of sub-entries for 1-to-N mappings (for example, a telephone number is a single field in Siebel, but Fusion can store multiple telephone numbers in a sub-table)
    - E-mail notification on any error (containing the error message, instance number, and detailed payload)
    - Schematron validation

    Interested? Looking for more details or a partnership with ec4u for a "Siebel2FusionCRM Integration" project? Contact: Gregor Bublitz, Director Expert Services ([email protected])

    Read the article

  • Mouse wheel speed, a permanent solution?

    - by Logan
    I would like to address an issue that has been around for a while now: the wireless mouse's wheel speed is abnormally fast on Ubuntu (as well as on Mac OS X, as I have read), and the temporary fix is to unplug and re-plug the wireless adapter. A solution for this has been asked for in various topics on the Ubuntu forums and on Ask Ubuntu, but the best answer so far is still to re-plug the mouse's wireless adapter, which fixes the wheel speed only until the next reboot. My question is: what can be done to make this permanent? Can a shell script be written, and first of all, what could be causing this? If you could give me some ideas on why this problem occurs, I would happily write a script for it. (I am thinking that if a simple re-plug of the adapter fixes it, a script that disables the device and re-enables it, or something like that, could do the trick; a sketch of that idea follows below.) I appreciate any discussion and ideas on the subject. Here are some already-discussed topics on the same subject that I've researched: Mouse wheel jumpy on scrolling; Mouse wheel scrolling too fast. There's a lot more than that on the net; all of them end with the re-plug solution.
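
    One way to script the "disable and re-enable" idea without physically re-plugging: on Linux, a device can be detached and reattached by writing its bus ID to the USB driver's unbind and bind files under sysfs. Below is a hedged sketch, not a tested fix; DEVICE_ID is a placeholder you would look up with lsusb -t or by listing /sys/bus/usb/drivers/usb/, and the script must run as root:

        #!/usr/bin/env python3
        # Sketch: simulate re-plugging the receiver by unbinding and rebinding
        # it at the USB driver level via sysfs. DEVICE_ID is a placeholder.
        import sys
        import time

        DEVICE_ID = "2-1.4"  # hypothetical bus ID of the wireless receiver
        DRIVER_PATH = "/sys/bus/usb/drivers/usb"

        def replug(device_id: str) -> None:
            with open(f"{DRIVER_PATH}/unbind", "w") as f:
                f.write(device_id)   # detach the device, as if unplugged
            time.sleep(1)            # give the kernel a moment to settle
            with open(f"{DRIVER_PATH}/bind", "w") as f:
                f.write(device_id)   # reattach it, as if freshly plugged in

        if __name__ == "__main__":
            replug(sys.argv[1] if len(sys.argv) > 1 else DEVICE_ID)

    Hooked into a startup script, this would at least automate the workaround until the underlying driver issue is found.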

    Read the article

  • Which Bliki (Blog+Wiki) solution can you recommend?

    - by asmaier
    I'm searching for a good Bliki solution, meaning a combination of blog and wiki that I can install on my own web space. I would like to be able to write articles in wiki style, much like with MediaWiki. So I want to use a wiki markup language, have a revision history, comments, internal links to other pages (maybe in other languages), and be able to collaboratively edit the articles. On the other hand, I would like to have a blog-like view of my articles, showing new articles (and changes to existing articles) in a time-ordered fashion. It would be nice if it were possible to search through the articles and also tag them, so one could generate a tag cloud. A nice feature would also be the ability to order the articles according to views, or even a voting system for the articles. Also good would be a permission system to keep certain articles private, showing them only to people logged in to the platform. Apart from these nice-to-have features, an absolute must-have for the Bliki platform I'm searching for is the ability to handle math equations (written in LaTeX syntax) and display them either as pictures, like MediaWiki does, or even better using MathJax. At the moment I'm using a web service called Wikidot which offers some of the mentioned features; however, the free version shows too many advertisements, the blog feature is not mature, the design is quite ugly, and page loading is often slow. So I want to install a Bliki solution on my own web space. Can you recommend any solution for that?

    Read the article

  • Better solution for boolean mixing?

    - by Ruben Nunez
    Sorry if this question has been asked in the past, but searching Google and here didn't yield relevant results, so here goes. I'm working on a fragment shader that implements both conditional/boolean diffuse and bump mapping (that is to say, you don't need a diffuse texture or a normals texture, and if they're not present, they're simply changed to default values). My current solution is to use a uniform float to say "mix amount". For example, computing the diffuse texel works as:

        // Compute diffuse amount scaled by vCol.
        // If no texture is present (mDif = 0.0), then DiffuseTexel = vCol.
        // kT[0] is the diffuse texture, vTex is the texture co-ordinates, and
        // mDif is the uniform float containing the mix amount (either 0.0 or 1.0).
        vec4 DiffuseTexel = vCol * mix(vec4(1.0), texture2D(kT[0], vTex), mDif);

    While that works great and all, I was wondering if there's a better way of doing this, as I will never have any use for in-between values for funky effects. I know that perhaps the best solution is to simply write separate shaders for mDif = 0.0 and mDif = 1.0, but I'd like a more elegant solution than splicing shaders before compiling or writing multiple shader files and keeping each one updated. Any ideas are greatly appreciated. =)

    Read the article

  • Oracle Enterprise Manager 12c Testing-as-a-Service Solution

    - by user810030
    With organizations spending as much as 50 percent of their QA time on non-test-related activities like setting up hardware and deploying applications and test tools, the cloud brings obvious benefits. As a key component of Oracle Enterprise Manager, our Application Quality Management products have been helping customers with application load testing, functional testing, and test process management, as well as test data management, data masking, and real application testing. These products enable customers to thoroughly test applications and their underlying infrastructure to help ensure the best quality, scalability, and availability prior to deployment. Today, Oracle announced the Oracle Enterprise Manager 12c Testing-as-a-Service Solution. This solution allows users to significantly decrease the time needed to set up a complete test environment while enhancing testing efficiency. Please read the press release mentioned above and join the discussion on this topic in our Enterprise Manager LinkedIn Group (you need to be a member), or visit our booth this week during the EuroSTAR Software Testing conference in Amsterdam, where we can demo this solution. I hope you find this helpful. Stay Connected: Twitter | Facebook | YouTube | Linkedin | Newsletter

    Read the article

  • Combat server downtime by duplicating server and re-routing when main server is down

    - by Wasim
    I have a CentOS server which at times either crashes or gets hit with DDoS attacks. At the moment I have an off-site backup which holds 1.7 TB of data. I'm currently paying as much for the backup as I am for the server, and I was looking for advice from experienced people on the best way to proceed from here. Would it be a viable solution to ditch the off-site backup and instead purchase an additional server which is an exact duplicate of the first? Then, if the first server goes down, users are re-routed to the second server without ever noticing the first is down. This would give me an automatic backup of the first server (albeit not off-site) and remove the need for the expensive off-site backup. Is this a true alternative to the pricey backup, or is an off-site backup absolutely necessary? And how would I go about doing this (it's obviously fairly complex, so links to reading material or just the terminology for the procedure would be great; a rough sketch of the re-routing part follows below)? Appreciate the help and advice.
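
    For the re-routing piece, the moving parts look roughly like the sketch below: a monitor polls the primary and, after a few consecutive failures, points the site's DNS record at the standby. Everything here is illustrative: the addresses are placeholders and update_dns_record() is a hypothetical stub for whatever API your DNS provider exposes.

        #!/usr/bin/env python3
        # Sketch of DNS-based failover: poll the primary's health URL and,
        # after several consecutive failures, repoint DNS at the standby.
        import time
        import urllib.request

        PRIMARY_HEALTH_URL = "http://203.0.113.10/health"  # placeholder primary
        STANDBY_IP = "203.0.113.20"                        # placeholder standby

        def primary_is_up(timeout: float = 5.0) -> bool:
            try:
                with urllib.request.urlopen(PRIMARY_HEALTH_URL, timeout=timeout) as resp:
                    return resp.status == 200
            except OSError:
                return False

        def update_dns_record(ip: str) -> None:
            """Hypothetical stub: repoint the site's A record at `ip`."""
            raise NotImplementedError("call your DNS provider's API here")

        if __name__ == "__main__":
            failures = 0
            while True:
                failures = 0 if primary_is_up() else failures + 1
                if failures >= 3:  # require consecutive failures to avoid flapping
                    update_dns_record(STANDBY_IP)
                    break
                time.sleep(30)

    Two caveats worth noting: DNS changes only take effect once the record's TTL expires, which is why real setups often use a load balancer, a keepalived-style floating IP, or a commercial DNS failover service instead; and a live mirror is not a substitute for off-site backup, since deletions and corruption get replicated to both machines.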

    Read the article

  • USDM and Oracle Offer a New Part 11 Compliant Solution for Life Sciences

    - by Michael Snow
    Guest post today provided by Oracle partner, USDM

    Regulated Content in WebCenter: USDM and Oracle offer a new Part 11 compliant solution for Life Sciences (White Paper)

    Life science customers now have the ability to take advantage of all of the benefits of Oracle WebCenter Content, a global leader in Enterprise Content Management. For the past year, USDM has been developing best-practice compliance solutions to meet regulated content management requirements for 21 CFR Part 11 in WebCenter Content. USDM has been an expert in ECM for life sciences since 1999, and in 2011 it certified WebCenter as a 21 CFR Part 11 compliant content management platform (White Paper). In addition, USDM has built Validation Accelerator Packs for WebCenter to enable life science organizations to quickly and cost-effectively validate this world-class solution. With the Part 11 certification, Oracle WebCenter now gives regulated life science organizations the ability to manage regulatory content in WebCenter, as well as to take advantage of all of the additional functionality of WebCenter, including a complete, open, and integrated portfolio of portal, web experience management, content management, and social networking technology. A few examples of the Part 11 functionality included in the product: E-Sign, E-Sign Render, Metadata History, Audit Trail Report, and Access Reporting. Gone are the days when life science companies had to spend millions of dollars a year to implement, maintain, and validate ECM systems that no longer meet ever-changing business and regulatory requirements. Life science companies now have the ability to use WebCenter Content, an ECM system with a substantially lower cost of ownership and unsurpassed functionality. Oracle has been #1 in life sciences because of its ability to develop cost-effective, easy-to-use, scalable solutions which help increase insight and efficiency to drive growth for its customers. Adding a world-class ECM solution to this product portfolio gives life science organizations the chance to retire costly ECM systems that no longer meet their needs and use WebCenter, part of the Oracle Fusion technology stack, with their other leading enterprise applications. USDM provides:

    - Expertise in life science ECM business processes
    - Prebuilt life science configuration in WebCenter
    - Validation Accelerator Packs for WebCenter

    USDM is very proud to support Oracle's expanding commitment to Life Sciences. For more information please contact: [email protected]

    Oracle will be exhibiting at DIA 2012 in Philadelphia on June 25-27. Stop by our booth (#2825) to learn more about the advantages of a centralized ECM strategy and see the Oracle WebCenter Content solution, our 21 CFR Part 11 compliant content management platform.

    Read the article

  • My Code Kata–A Solution Kata

    - by Glav
    There are many developers and coders out there who like to do code katas to keep their coding ability up to scratch and to practice their skills. I think it is a good idea. While I like the concept, I find them dead boring and of minimal purpose. Yes, they serve to hone your skills, but that's about it. They are often quite abstract, in that they usually focus on a small problem set requiring specific solutions. That is fair enough, as that is how they are designed, but again, I find them quite boring. What I personally like to do is go for something a little larger and a little more fun. It takes more time and is not as easily executed as a kata, but it serves the same purpose from a practice perspective and allows me to continue to solve some problems that are not directly part of the initial goal. This means I can cover a broader learning range and have a bit more fun. If I am lucky, sometimes they even end up being useful tools. With that in mind, I thought I'd share my current 'kata'. It is not really a code kata, as it is too big; I prefer to think of it as a 'solution kata'. The code is on bitbucket here. What I wanted to do was create a kind of simplistic virtual world where I can create a player, or a class, stuff it into the world, and see if it survives and can navigate its way to the exit. Requirements were pretty simple:

    - Must be able to define a map to describe the world using simple X,Y co-ordinates. Z co-ordinates as well if you feel like getting clever.
    - Should have the concept of entrances, exits, solid blocks, and potentially other materials (again, if you want to get clever).
    - A coder should be able to easily write a class which will act as an inhabitant of the world.
    - An inhabitant will receive stimulus from the world in the form of the surrounding environment and be able to make a decision on an action, which it passes back to the 'world' for processing.
    - At a minimum, an inhabitant will have sight and speed characteristics which determine how far it can 'see' in the world and how fast it can move.
    - Coders who write a really bad 'inhabitant' should not adversely affect the rest of the world.
    - Should allow multiple inhabitants in the world.

    So that was the solution I set out to act as a practice solution and a little bit of fun. It had some interesting problems to solve, and I figured that if it turned out OK, I could potentially use it as a 'developer test' for interviews: ask a potential coder to write a class for an inhabitant, show them the map they will navigate, but also mention that we will use their code to navigate a map they have not yet seen that is a little more complex. I have been playing with the solution for a short time now and have it working in basic concepts. Below is a screenshot using a very basic console visualiser that shows the map, boundaries, blocks, entrance, exit, and players/inhabitants. The yellow asterisks '*' are the players, the green 'O' is the entrance, the purple '^' is the exit, and the maroon/brown '#' are solid blocks. The players can move around at different speeds, knock into each other, and make directional movement decisions based on what they see and who is around them. It has been quite fun to write, and it is also quite fun to develop different players to inject into the world. The code below shows a really simple implementation of an inhabitant that can work out what to do based on stimulus from the world. It is pretty simple and just tries to move in some direction if there is nothing blocking the path.
    public class TestPlayer : LivingEntity
    {
        public TestPlayer()
        {
            Name = "Beta Boy";
            LifeKey = Guid.NewGuid();
        }

        public override ActionResult DecideActionToPerform(EcoDev.Core.Common.Actions.ActionContext actionContext)
        {
            try
            {
                var action = new MovementAction();
                // move forward if we can
                if (actionContext.Position.ForwardFacingPositions.Length > 0)
                {
                    if (CheckAccessibilityOfMapBlock(actionContext.Position.ForwardFacingPositions[0]))
                    {
                        action.DirectionToMove = MovementDirection.Forward;
                        return action;
                    }
                }
                if (actionContext.Position.LeftFacingPositions.Length > 0)
                {
                    if (CheckAccessibilityOfMapBlock(actionContext.Position.LeftFacingPositions[0]))
                    {
                        action.DirectionToMove = MovementDirection.Left;
                        return action;
                    }
                }
                if (actionContext.Position.RearFacingPositions.Length > 0)
                {
                    if (CheckAccessibilityOfMapBlock(actionContext.Position.RearFacingPositions[0]))
                    {
                        action.DirectionToMove = MovementDirection.Back;
                        return action;
                    }
                }
                if (actionContext.Position.RightFacingPositions.Length > 0)
                {
                    if (CheckAccessibilityOfMapBlock(actionContext.Position.RightFacingPositions[0]))
                    {
                        action.DirectionToMove = MovementDirection.Right;
                        return action;
                    }
                }
                return action;
            }
            catch (Exception ex)
            {
                World.WriteDebugInformation("Player: " + Name, string.Format("Player generated exception: {0}", ex.Message));
                throw ex;
            }
        }

        private bool CheckAccessibilityOfMapBlock(MapBlock block)
        {
            if (block == null
                || block.Accessibility == MapBlockAccessibility.AllowEntry
                || block.Accessibility == MapBlockAccessibility.AllowExit
                || block.Accessibility == MapBlockAccessibility.AllowPotentialEntry)
            {
                return true;
            }
            return false;
        }
    }

    It is simple, and it seems to work well. The world implementation itself decides the stimulus context that is passed to the inhabitant to make an action decision. All movement is carried out on separate threads and timed appropriately to be as fair as possible and to cater for additional skills such as speed, and eventually maybe stamina and strength, with actions like fighting. It is pretty fun to make up random maps and see how your inhabitant does. You can download the code from here. Along the way I have played with parallel extensions to make the compute-intensive stuff spread across all cores, had to heavily factor in visibility of methods and properties (so class design was paramount), and worked out movement algorithms that play fairly in the world and properly favour the players with higher abilities, as well as a host of other issues. So that is my 'solution kata'. If I keep going with it, I may develop a web interface for it where people can upload assemblies and watch their player within a web browser visualiser, and maybe even a map designer. What do you do to keep the fires burning?

    Read the article

  • Recovering a lost website with no backup?

    - by Jeff Atwood
    Unfortunately, our hosting provider experienced 100% data loss, so I've lost all content for two hosted blog websites: http://blog.stackoverflow.com http://www.codinghorror.com (Yes, yes, I absolutely should have done complete offsite backups. Unfortunately, all my backups were on the server itself. So save the lecture; you're 100% absolutely right, but that doesn't help me at the moment. Let's stay focused on the question here!) I am beginning the slow, painful process of recovering the websites from web crawler caches. There are a few automated tools for recovering a website from internet web spider (Yahoo, Bing, Google, etc.) caches, like Warrick, but I had some bad results using them:

    - My IP address was quickly banned from Google for using it
    - I got lots of 500 and 503 errors and "waiting 5 minutes…"
    - Ultimately, I could recover the text content faster by hand

    I've had much better luck using a list of all blog posts, clicking through to the Google cache, and saving each individual file as HTML. While there are a lot of blog posts, there aren't that many, and I figure I deserve some self-flagellation for not having a better backup strategy. Anyway, the important thing is that I've had good luck getting the blog post text this way, and I am definitely able to get the text of the web pages out of the Internet caches. Based on what I've done so far, I am confident I can recover all the lost blog post text and comments. However, the images that go with each blog post are proving…more difficult. Any general tips for recovering website pages from Internet caches, and in particular, places to recover archived images from website pages? (And, again, please, no backup lectures. You're totally, completely, utterly right! But being right isn't solving my immediate problem… Unless you have a time machine…)
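
    For what it's worth, the by-hand Google-cache approach can be semi-automated while staying polite. A minimal sketch, with its assumptions flagged: the cache endpoint below is the one the Google cache links resolve to, urls.txt is a hypothetical one-URL-per-line list of posts, and the one-minute delay is deliberately conservative, since aggressive fetching is exactly what triggers bans.

        #!/usr/bin/env python3
        # Rate-limited fetch of Google's cached copy of each URL in urls.txt.
        import time
        import urllib.parse
        import urllib.request

        DELAY_SECONDS = 60  # deliberately slow; faster polling risks an IP ban

        def fetch_cached(url: str) -> bytes:
            cache_url = ("http://webcache.googleusercontent.com/search?q=cache:"
                         + urllib.parse.quote(url, safe=""))
            req = urllib.request.Request(cache_url,
                                         headers={"User-Agent": "Mozilla/5.0"})
            with urllib.request.urlopen(req) as resp:
                return resp.read()

        if __name__ == "__main__":
            with open("urls.txt") as url_list:
                for line in url_list:
                    url = line.strip()
                    if not url:
                        continue
                    out_name = urllib.parse.quote(url, safe="") + ".html"
                    with open(out_name, "wb") as out:
                        out.write(fetch_cached(url))  # save the raw cached HTML
                    time.sleep(DELAY_SECONDS)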

    Read the article

  • SQL SERVER – Database in RESTORING State for Long Time

    - by Pinal Dave
    A very interesting question I received the other day: "Our database has been in the restoring state for a long time. We have already restored all the necessary files. After restoring the files, we expected the database to be operational; however, it stays in restoring mode. Any suggestions?" The question is very common. I sent the user follow-up emails to understand what was actually going on. I realized that after restoring their bak files and log files, their database was in the restoring state because they had not restored the latest log file with the RECOVERY option. As they had completed the whole restore sequence (bak and logs in order), what they really needed was to recover the database out of the NORECOVERY state. A user can keep restoring log files as long as the database is in NORECOVERY mode. Once the database is recovered, it becomes operational and normal database operations can continue. After that, we cannot restore further logs, as the log chain after recovery is meaningless. This is why the database has to be in the NORECOVERY state while it is being restored. There are three different ways to recover the database:

    1) Recover the database manually with the following command:

        RESTORE DATABASE database_name WITH RECOVERY

    2) Recover the database with the last log file:

        RESTORE LOG database_name FROM backup_device WITH RECOVERY

    3) Recover the database when the bak is restored:

        RESTORE DATABASE database_name FROM backup_device WITH RECOVERY

    To understand how the backup restore timeline works, read Backup Timeline and Understanding of Database Restore Process in Full Recovery Model. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Backup and Restore, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Duplicity can't connect to CloudFiles "Network is unreachable"

    - by jwandborg
    Whenever I click "Backup now" in the Backup GUI, the smaller "Back Up" window opens, and after a while I get the following error message:

    Traceback (most recent call last):
      File "/usr/bin/duplicity", line 1359, in <module>
        with_tempdir(main)
      File "/usr/bin/duplicity", line 1342, in with_tempdir
        fn()
      File "/usr/bin/duplicity", line 1202, in main
        action = commandline.ProcessCommandLine(sys.argv[1:])
      File "/usr/lib/python2.7/dist-packages/duplicity/commandline.py", line 942, in ProcessCommandLine
        globals.backend = backend.get_backend(args[0])
      File "/usr/lib/python2.7/dist-packages/duplicity/backend.py", line 156, in get_backend
        return _backends[pu.scheme](pu)
      File "/usr/lib/python2.7/dist-packages/duplicity/backends/cloudfilesbackend.py", line 70, in __init__
        self.container = conn.create_container(container)
      File "/usr/lib/python2.7/dist-packages/cloudfiles/connection.py", line 250, in create_container
        response = self.make_request('PUT', [container_name])
      File "/usr/lib/python2.7/dist-packages/cloudfiles/connection.py", line 189, in make_request
        response = retry_request()
      File "/usr/lib/python2.7/dist-packages/cloudfiles/connection.py", line 182, in retry_request
        self.connection.request(method, path, data, headers)
      File "/usr/lib/python2.7/httplib.py", line 955, in request
        self._send_request(method, url, body, headers)
      File "/usr/lib/python2.7/httplib.py", line 989, in _send_request
        self.endheaders(body)
      File "/usr/lib/python2.7/httplib.py", line 951, in endheaders
        self._send_output(message_body)
      File "/usr/lib/python2.7/httplib.py", line 811, in _send_output
        self.send(msg)
      File "/usr/lib/python2.7/httplib.py", line 773, in send
        self.connect()
      File "/usr/lib/python2.7/httplib.py", line 1154, in connect
        self.timeout, self.source_address)
      File "/usr/lib/python2.7/socket.py", line 571, in create_connection
        raise err
    error: [Errno 101] Network is unreachable

    I use Rackspace Cloud Files as the storage backend; the last backup, 3 days ago, was successful, and I have not changed any settings since then.

    Read the article

  • Questions about Norton Ghost

    - by Nrew
    I have used Norton Ghost to back up my hard drive, and it produced a .v2i file.

    - Will I be able to use this backup from my PC on my laptop?
    - Can I use this backup to restore my dual-boot PC back into shape if the MBR is destroyed or damaged?
    - Can I use this backup to restore my OS and applications on the same machine if the machine's hard drive is reformatted?

    Read the article

  • rsnapshot preexec

    - by Zulakis
    I am mounting my remote backup volume using an rsnapshot cmd_preexec script. If the /mnt/backup directory doesn't exist when rsnapshot starts, I get this error: ERROR: /mnt/backup does not exist. If the directory exists but the preexec mounting fails, rsnapshot does not stop, which results in the backup ending up in the completely wrong place... What should I do about this? Edit: I know that I could use a wrapper script, but I don't want to do this..
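
    Two things usually close this hole. First, point snapshot_root at a subdirectory that only exists on the mounted volume (e.g. /mnt/backup/snapshots) and set no_create_root 1 in rsnapshot.conf, so rsnapshot aborts instead of creating the root on the bare mountpoint. Second, make cmd_preexec itself verify the mount before reporting success; below is a minimal sketch of such a preexec script (not a wrapper around rsnapshot itself), assuming an /etc/fstab entry exists for /mnt/backup. Some admins also chattr +i the unmounted mountpoint so stray writes fail outright instead of filling the root filesystem.

        #!/usr/bin/env python3
        # Sketch of a stricter cmd_preexec: mount the backup volume and exit
        # non-zero unless /mnt/backup is really a mount point afterwards.
        import os
        import subprocess
        import sys

        MOUNT_POINT = "/mnt/backup"

        def main() -> int:
            if not os.path.ismount(MOUNT_POINT):
                subprocess.run(["mount", MOUNT_POINT])  # relies on /etc/fstab
                if not os.path.ismount(MOUNT_POINT):
                    print(f"ERROR: {MOUNT_POINT} is not mounted; aborting",
                          file=sys.stderr)
                    return 1
            return 0

        if __name__ == "__main__":
            sys.exit(main())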

    Read the article

  • File copying software to do this kind of work... in Windows 7 32 bit

    - by Senthil
    I need software (Windows 7, 32-bit) to help me with this process: I have my documents, music, video clips, movies, and pictures on my hard disk. These will not be scattered around the system, but will all be inside C:\Senthil\. At the end of every week, I want to plug in an external hard disk and run a program that makes sure whatever is inside C:\Senthil\ is also present on the external disk. Files deleted from C:\Senthil\ should be deleted there, new files should be copied, etc. At the end of the process, every bit inside the source folder on my internal disk should be on my external disk. A couple of important requirements and points:

    - I do NOT need multiple or historic versions of my files. I only want the latest copy to be present in my "backup".
    - Incremental backup makes sense: if files were not touched since the last backup, they need not be copied.
    - The size of my folder will run into GBs, and in a year or two will go into TBs, but I will make sure the external HDD is equal to or bigger than my source folder.
    - I do not want it to run automatically, because when I accidentally delete a file in my source, it will delete the one in the backup (I know this is why we have versioning facilities). I just want to run it manually, so that I am in control of when the backup is made and what is backed up, and so that I can pick something from the backup and restore it to the source folder in the above situation.

    Is there any software that will let me do exactly this? I don't want any other "smart" facility of the software to interfere with this process. I know what I want, and the software can keep its smartness to itself :D The main reason I am asking is that I am a software developer and could write this myself, but I am a little constrained by time at the moment and want to know if an existing program can do this. Kindly don't worry about earthquakes or fire or snowstorms and bring up the "in case of a natural disaster your backup will also be in the damage zone and will be lost" argument, because:

    - I will have bigger things to worry about than my holiday memories.
    - I don't think I will digitally store any life-ruining documents.
    - This backup is only to avoid the inconvenience of obtaining a new copy of stuff that I have, not to protect it against the end of the world. I am more worried about power surges in my area frying my system, hard disk failure, children who merrily hit Delete, teens who hit Shift + Delete, or myself getting a little careless at times!

    In short: is there file/folder syncing software that listens to what I say and doesn't try to act smart? Please forgive me if I sound arrogant :)
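
    On Windows 7, the built-in robocopy comes close to this with its /MIR (mirror) switch, run manually whenever the disk is plugged in, so that may be worth trying before writing anything. For the do-it-yourself route, the core logic is small; here is a minimal sketch under the stated requirements (the paths are placeholders, no versioning, runs only when invoked by hand):

        #!/usr/bin/env python3
        # Sketch of a one-way mirror: make DEST an exact copy of SOURCE.
        # New/changed files (compared by size and mtime) are copied; anything
        # in DEST that is gone from SOURCE is deleted.
        import os
        import shutil

        SOURCE = r"C:\Senthil"   # assumed source folder
        DEST = r"E:\Senthil"     # assumed target on the external disk

        def mirror(src: str, dst: str) -> None:
            os.makedirs(dst, exist_ok=True)
            src_names = set(os.listdir(src))
            # Delete anything in dst that no longer exists in src
            for name in os.listdir(dst):
                if name not in src_names:
                    path = os.path.join(dst, name)
                    if os.path.isdir(path):
                        shutil.rmtree(path)
                    else:
                        os.remove(path)
            # Copy new or changed entries, recursing into folders
            for name in src_names:
                s, d = os.path.join(src, name), os.path.join(dst, name)
                if os.path.isdir(s):
                    mirror(s, d)
                    continue
                if os.path.exists(d):
                    st, dt = os.stat(s), os.stat(d)
                    if st.st_size == dt.st_size and int(st.st_mtime) == int(dt.st_mtime):
                        continue  # unchanged since last run: skip (incremental)
                shutil.copy2(s, d)  # copy2 preserves mtime, keeping the check valid

        if __name__ == "__main__":
            mirror(SOURCE, DEST)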

    Read the article

  • Backing up a 22 GB MySQL database daily

    - by unknown (yahoo)
    Right now I am able to do the backup using mysqldump, but I have to take down the web server AND it takes around 5 minutes to do the backup. If I don't take down the web server, the backup takes forever and never finishes, and the website becomes inaccessible while it runs. Is there a quicker/better way to back up my 22 GB and growing database? All the tables are MyISAM.

    Read the article

  • SQL SERVER – Size of Index Table for Each Index – Solution 3 – Powershell

    - by pinaldave
    Laerte Junior – if you are a PowerShell user, Laerte Junior is not a new name. He is a man with exceptional knowledge of PowerShell. He is not only very knowledgeable, but also very kind and eager to help those in need. I had been attempting to set up PowerShell for many days, but constantly faced issues and was not able to get going with this tool. Finally, yesterday I sent an email to Laerte in response to his comment posted here. Within 5 minutes, Laerte came online and helped me with the solution. He spent nearly 15 minutes working along with me to solve my problem with the installation. And yes, he resolved it remotely without looking at my screen. What a skilled and exceptional person!! I will soon post a detailed note about the issue I faced and resolved with the help of Laerte. Here is his solution to my earlier puzzle, in his own words. Read the original puzzle here and Laerte's solution here.

    Hi Pinal, I do not say better, but maybe another approach for enthusiasts of PowerShell and the SQLPSX library would be:

    1 – All indexes in all tables and all databases:

        Get-SqlDatabase -sqlserver "Yourserver" | Get-SqlTable | Get-SqlIndex | Format-Table Server,dbname,schema,table,name,id,spaceused

    2 – All indexes in all tables and a specific database:

        Get-SqlDatabase -sqlserver "Yourserver" "Yourdb" | Get-SqlTable | Get-SqlIndex | Format-Table Server,dbname,schema,table,name,id,spaceused

    3 – All indexes in a specific table and database:

        Get-SqlDatabase -sqlserver "Yourserver" "Yourdb" | Get-SqlTable "YourTable" | Get-SqlIndex | Format-Table Server,dbname,schema,table,name,id,spaceused

    And to output to txt, pipe to Out-File:

        Get-SqlDatabase -sqlserver "Yourserver" | Get-SqlTable | Get-SqlIndex | Format-Table Server,dbname,schema,table,name,id,spaceused | Out-File c:\IndexesSize.txt

    If you have one txt file with all your servers, it can be done for all of them as well. Let's say you have all your servers in servers.txt, something like:

        NameServer1
        NameServer2
        NameServer3
        NameServer4

    We could use:

        foreach ($Server in Get-Content c:\temp\servers.txt) { Get-SqlDatabase -sqlserver $Server | Get-SqlTable | Get-SqlIndex | Format-Table Server,dbname,schema,table,name,id,spaceused }

    :) After fixing my issue with PowerShell, I ran Laerte's second suggestion – "All indexes in all tables and a specific database" – and found accurate output. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Index, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: Powershell

    Read the article

  • SQL SERVER – Size of Index Table for Each Index – Solution 2

    - by pinaldave
    Earlier I ran a puzzle where I asked a question regarding the size of the index table for each index in a database, over here: SQL SERVER – Size of Index Table – A Puzzle to Find Index Size for Each Index on Table. I received a good number of answers, and I blogged about them here: SQL SERVER – Size of Index Table for Each Index – Solution. As a comment to that blog, I received another very interesting comment that provides a near-accurate answer to the original question. Many thanks to Rama Mathanmohan for providing this wonderful solution.

        SELECT
            OBJECT_NAME(i.OBJECT_ID) AS TableName,
            i.name AS IndexName,
            i.index_id AS IndexID,
            8 * SUM(a.used_pages) AS 'Indexsize(KB)'
        FROM sys.indexes AS i
        JOIN sys.partitions AS p ON p.OBJECT_ID = i.OBJECT_ID AND p.index_id = i.index_id
        JOIN sys.allocation_units AS a ON a.container_id = p.partition_id
        GROUP BY i.OBJECT_ID, i.index_id, i.name
        ORDER BY OBJECT_NAME(i.OBJECT_ID), i.index_id

    Let me know if you have a better script for the same. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Pinal Dave, Readers Contribution, SQL, SQL Authority, SQL Data Storage, SQL Index, SQL Performance, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • VMware releases VMware Go Pro, its cloud virtualization solution for SMBs and a gateway to VMware vSphere

    VMware has announced the release of its cloud virtualization solution for small and medium-sized businesses: VMware Go Pro. VMware Go Pro's main objective is to ease the virtualization efforts of small and medium-sized businesses by offering them an ergonomic tool that simplifies the management of information systems and aims to improve productivity. The tool includes a single central console to federate and simplify the administration of physical and virtual infrastructures, a console that frees teams from recurring tasks so that they can devote themselves to...

    Read the article

  • Visual Studio 2008 Solution Setup

    - by Ben Griswold
    In this screencast, Noah and I demonstrate preferred practices around .NET solution setup, naming conventions, and version control. I consider this an introductory video. If you've been around the block, you might want to skip this episode, but if you're a .NET/Visual Studio newbie, it may be worth a look. YouTube - Visual Studio 2008 Solution Setup. This is one of our first screencasts. Actually, it is the very first. If you have feedback, I'd love to hear it.

    Read the article

  • SQL SERVER – Solution – 2 T-SQL Puzzles – Display Star and Shortest Code to Display 1

    - by pinaldave
    Earlier on this blog we asked two puzzles. The response from all of you was nothing short of amazing: I received 350+ responses. Many are valid, and many were indeed something I had not thought about. I strongly suggest you read all the puzzles and their answers here – trust me, if you start reading the comments you will not stop till you have read every single one. Seriously, trust me on it. Personally, I have learned a lot from them. Let us recap the puzzles here quickly.

    Puzzle 1: Why does the following code, when executed in SSMS, display the result as a * (star)?

        SELECT CAST(634 AS VARCHAR(2))

    Puzzle 2: Write the shortest code that produces the result 1 without using any numbers in the SELECT statement.

    Bonus Q: How many different operating systems (OS) does NuoDB support?

    As I mentioned earlier, the participation was nothing but amazing. I will write about the winners and the best answers shortly. Meanwhile, I will give to-the-point answers to the above puzzles.

    Solution 1: When you convert character or binary expressions (char, nchar, nvarchar, varchar, binary, or varbinary) to an expression of a different data type, data can be truncated, only partially displayed, or an error is returned because the result is too short to display. Conversions to char, varchar, nchar, nvarchar, binary, and varbinary are truncated, except for the conversions shown in the following table. Text and table referenced from MSDN.

    Solution 2: The shortest code to produce the answer 1:

        SELECT EXP($)
        SELECT COS($)
        SELECT DAY($)

    When you SELECT $, it gives the result 0.00, and the EXP of that is 1. I believe it is pretty neat. There were plenty of other answers, but this was the shortest. An even shorter answer would be PRINT EXP($), but no one proposed it, as in the original question I explicitly mentioned SELECT.

    Bonus Answer: 5 OSes: Windows, MacOS, Linux, Solaris, Joyent SmartOS (Reference).

    Please do read every single comment here. Do leave a comment on which one you think is the best of all. Meanwhile, if there is a better solution and I have missed it, do let me know, as we still have time to correct it. I will be selecting the winner before the weekend, as I am going through each and every one of the 350 comments. I will be selecting the best comments along with the winning comment. If our selections match, one of you may still win something cool. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology Tagged: NuoDB

    Read the article

  • A Complete Customer Experience Solution (3 of 3 in 'No Customer Left Behind' Series)

    - by Kathryn Perry
    A guest post by David Vap, Group Vice President, Oracle Applications Product Development

    In my previous post, I talked about taking three concrete steps to improve your customers' overall experiences: 1) understand your customer, 2) empower your ecosystem, and 3) adapt your business. To do these effectively and efficiently, it's important to find the right technology that can bridge the gaps across your channels, interactions, departments, and repositories. Oracle has spent the past three years and more than six billion dollars acquiring and developing some of the world's best-of-breed applications. The result is the most comprehensive customer experience (CX) portfolio offering in the world – bar none:

    - ATG – best-in-class selling experiences
    - FatWire – best-in-class marketing experiences
    - InQuira – best-in-class support experiences
    - Endeca – best-in-class search experiences
    - RightNow – best-in-class service experiences
    - Vitrue & Involver – best-in-class social marketing
    - Collective Intellect – best-in-class social listening

    We don't expect organizations to eat the CX elephant in one bite, nor should they try to. There are key strategic initiatives within each of the four main pillars of our customer experience offering for which we deliver solutions:

    1. Customer Experience for Marketing: social listening and engagement, social marketing, marketing websites, demand generation and lead management, marketing and loyalty management

    2. Customer Experience for Commerce: search, navigation & content delivery; cross-channel commerce; targeting & product recommendations; social commerce; order management & fulfillment; retail store operations

    3. Customer Experience for Sales: sales force automation, social selling, territory & quota management, revenue forecasting, partner relationship management, quote to cash, incentive compensation

    4. Customer Experience for Service: cross-channel customer service, knowledge management, social customer service, eligibility management; contracts, assets, and entitlements; industry-specific solutions; eBilling

    Oracle's customer experience portfolio is socially infused at each layer of our pillars rather than simply bolted on as a side process. This combines with the power of the cloud to run the parts of the solution that need the access, efficiency, and agility of a managed infrastructure, while you keep compliance control in the on-premise backbone infrastructure systems that run your business and don't change that often. Please take advantage of our teams of Oracle customer experience professionals and our key agency and technology partner ecosystem. They can help you develop strategic solution roadmaps that build and deliver customer experience and that are tailored to your business needs and objectives. No one has built a better customer service portfolio to manage the entire customer journey than Oracle. It is backed by CX thought leadership programs, a commitment from our executives, and a worldview that your technology decisions must be driven by your customer experiences to succeed. If you'd like to follow up on this conversation, please leave a comment or contact me at [email protected]. You can get more information on Oracle's complete customer experience solution here.

    Read the article

  • Product News: Oracle Unveils a Waste Management Solution for the Oracle E-Business Suite

    - by Evelyn Neumayr
    Oracle recently announced a new product to help organizations reduce the cost of compliance with international hazmat (short for hazardous materials), recycling, and environmental protection laws. This new waste management solution for Oracle E-Business Suite extends the capabilities of Oracle Depot Repair, Oracle Transportation Management, and Oracle Global Trade Management. It automates and monitors waste management processes to help ensure that hazardous materials are tracked and handled in accordance with regulatory requirements. Oracle's waste management solution for the Oracle E-Business Suite leverages Oracle Transportation Management and Oracle Global Trade Management, enabling customers to view in-transit inventory across the extended supply chain, while also providing a single repository for all legal, regulatory, and compliance-related information. Read here for more information.

    Read the article

  • Sage launches Sage CRM Express, a customer management solution designed specifically for SMBs

    Sage conducted a survey of its customers showing that 80% of SMBs have fewer than 4 salespeople and are looking for a customer-tracking tool that can interface easily with their existing sales management solution. To meet this expectation, Sage has announced the release of Sage CRM Express. Natively integrated with Sage 100, or offered in stand-alone mode, this new offering is therefore aimed at SMBs with fewer than 4 salespeople. Sage CRM Express is a fairly simple tool that tries to cover the entire customer process (pre-sales, sales, collections, ...

    Read the article
