Search Results

Search found 6226 results on 250 pages for 'enterprise visualization'.


  • Script/scheduled task to clean up Downloads folder (Windows 7)?

    - by Andrew
    My Downloads folder is my primary connection to the internet (and thus the world), and has rapidly become a virtual junk drawer. I imagine I'm not alone here. What I'd like is a script that creates a new folder (named YYYY_MM Archive so it sorts nicely), takes everything inside the Downloads folder for the current user (excluding previous archives), and moves it to the new folder, and I'd like this to run once a month. This strikes me as the kind of thing that should be easy to script, and answers on related questions seemed to suggest it is an eminently doable task in PowerShell. Is it? I'm on Windows 7 Enterprise.
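
    A minimal PowerShell sketch of that monthly archive, assuming the folder name and exclusion rule described above; schedule it with Task Scheduler on a monthly trigger:

      # Move everything in the current user's Downloads into a dated archive folder,
      # skipping previous "... Archive" folders. Works with the PowerShell 2.0 shipped with Windows 7.
      $downloads = Join-Path $env:USERPROFILE 'Downloads'
      $archive   = Join-Path $downloads ("{0:yyyy_MM} Archive" -f (Get-Date))
      New-Item -ItemType Directory -Path $archive -Force | Out-Null
      Get-ChildItem $downloads |
          Where-Object { $_.Name -notlike '* Archive' } |
          Move-Item -Destination $archive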

    Read the article

  • Windows Remote Desktop with multiple monitors

    - by BDW
    I'm looking for a remote desktop product that will allow me to switch on the fly how many monitors the remote desktop is using. This would be going from one Windows 7 machine to another; one is running Ultimate, the other Enterprise. I know I can use Windows RDP with multi-monitor support, but that seems to only allow me to use ALL monitors, and if I switch to a windowed view I get huge scrollbars because I'm going from 3 monitors to 1 window. What I'm looking for is something that will let me switch between 1, 2, or 3 monitor displays without restarting the RDP session, and that ends up with a true 1/2/3-monitor display - i.e., if I go from 3 monitors to 1, or to a window, I don't get scrollbars. Is there any such product?

    Read the article

  • Sticking with Ubuntu 12.04 while heavily using PPA for newest software updates (Apache 2.4, PHP 5.5)

    - by MechaStorm
    I was wondering whether it is worthwhile to stick with Ubuntu 12.04 LTS until 14.04 comes out, or whether I should switch to the latest Ubuntu server release, 13.10. My server needs are not enterprise-heavy, and my original reason for staying on LTS was simply to get security updates without having to upgrade the servers every couple of months. But as our software development moves forward, I have found that a lot of the default software versions in 12.04 are well out of date, forcing me to update via PPA or from source instead of from the default apt-get repositories, e.g. 12.04 ships PHP 5.3 and I'd like to get to 5.5. Is it worthwhile to simply move to 13.10 in that situation, with the idea of moving to 14.04 when it comes out?
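
    If you do stay on 12.04, the PPA route typically looks like the sketch below. This is only an illustration: ppa:ondrej/php5 is a widely referenced third-party PPA for newer PHP builds, but verify that it covers your Ubuntu release and the PHP version you need before relying on it.

      # add a third-party PPA and pull PHP from it instead of the stock 12.04 packages
      sudo add-apt-repository ppa:ondrej/php5
      sudo apt-get update
      sudo apt-get install php5    # now resolves to the PPA's newer build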

    Read the article

  • hosting website on a private network

    - by razor
    I'm currently running a website off 3 Linux servers. I'd like to set up a private network and only allow port 80 traffic to one of the servers. I'd also like to set up a VPN so that only I can access the servers via SSH (or any other port) for developing/debugging. How hard is this to set up, and what do I need to get? Do enterprise/commercial routers have VPN functionality built in? How do I handle DNS? E.g. www.mydomain.com would need to point to the router, which forwards traffic to the web server. Do I set the A record to the router, and somehow tell the router which server to send the HTTP request to? And how would I make server1.mydomain.com resolve to server1 within the private network (without editing hosts files)? Would I need to run my own DNS server (e.g. PowerDNS) to do this?
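
    If the gateway is itself a Linux box (or the router exposes equivalent settings), the moving parts look roughly like this sketch; all addresses and the dnsmasq choice are placeholders, not a recommendation of specific hardware:

      # on the gateway: forward inbound web traffic to the single web server
      iptables -t nat -A PREROUTING -p tcp --dport 80 -j DNAT --to-destination 192.168.1.11:80
      # internal name resolution without editing hosts files, e.g. dnsmasq on the LAN
      # (in /etc/dnsmasq.conf):
      #   address=/server1.mydomain.com/192.168.1.11
      #   address=/server2.mydomain.com/192.168.1.12

    The public A record for www.mydomain.com then points at the router's public IP, and the DNAT rule (or the router's port-forwarding page) decides which internal server receives port 80.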

    Read the article

  • Users own mapped network drives disappear when I set a GPP mapped drive

    - by Kim
    All the clients use Windows 7 SP1 x64 Enterprise. The domain controllers are Windows Server 2008 R2. I have configured the GPP to map "\\server\data" to the first available drive letter starting with I:. The action is Replace, and I have set "Hide/Show this drive" and "Hide/Show all drives" to "Show". Targeting is set to a specific security group. This works as expected and the drive is mapped for the correct users. The problem is that if a user has created their own mapped drives, those mappings disappear when the GPP mapping is applied; only the mapped drives from the GPP are shown in Explorer. I have not found any other mention of this particular problem when searching the Internet, and TechNet says nothing about what happens to drives that are already mapped.

    Read the article

  • why run a Linux shell command with &?

    - by George2
    I am using Red Hat Enterprise Linux version 5. Sometimes I notice people run commands with a couple of & signs. For example, in the command below there are two of them. What is their function? Are they always used together with nohup? nohup foo.sh <parameters to specify the script> >& <log_file_name> & Thanks in advance, George
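
    To unpack that command line: the trailing & backgrounds the job, >& <log_file_name> sends both stdout and stderr to the log, and nohup keeps the job alive after logout. A bash equivalent with the placeholders kept generic:

      # run foo.sh in the background, immune to hangups, with all output in one log file
      nohup ./foo.sh arg1 arg2 > foo.log 2>&1 &
      # '2>&1' merges stderr into stdout (the more common spelling of the '>&' redirection above);
      # the final '&' puts the command in the background so the shell returns immediately,
      # and 'nohup' stops SIGHUP from killing it when the terminal closes.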

    Read the article

  • How to effectively secure a dedicated server for intranet use?

    - by Mark
    I need to secure a dedicated server for intranet use. The server is managed, so it will have software-based security, but what else should be considered for enterprise-level security? The intranet hosts an ECM (Alfresco) managing and storing sensitive documents. As the information is sensitive, we are trying to make it as secure as reasonably possible (a requirement under UK law). We plan to encrypt the data in the database, and it will be connected to over SSL. Should we also consider a hardware firewall and a private LAN between the application server and the database server?
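
    On the network side, a host firewall that only admits the intranet ranges is a cheap extra layer regardless of whether a hardware firewall is added in front. A rough iptables sketch, with placeholder subnets and ports:

      # default-deny inbound; allow loopback, replies, and HTTPS/SSH from internal ranges only
      iptables -P INPUT DROP
      iptables -A INPUT -i lo -j ACCEPT
      iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
      iptables -A INPUT -p tcp -s 10.0.0.0/8 --dport 443 -j ACCEPT   # Alfresco over SSL
      iptables -A INPUT -p tcp -s 10.0.8.0/24 --dport 22 -j ACCEPT   # SSH from the admin subnet only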

    Read the article

  • Lock manager stops responding (lockd/nfslock), but shows as running

    - by dwaynehoov
    Essentially, the lock manager (lockd/nfslock) stops responding but still shows as running as a kernel process. Bouncing portmap and nfslock has no effect, and it doesn't show up in the portmapper's registered services: rpcinfo -p doesn't list nlockmgr, only portmapper and status. If I manually remount the drives, that fixes the issue. I'm assuming the service (lockd) goes stale or hangs when there is no NFS activity? It seems like issuing a mount for the NFS volumes "awakens" it and things work again once that happens. Please help me nail this down, or point me somewhere I can get more information on what might be happening. System info: Linux xxx.yyy.com 2.6.32-300.38.1.el5uek #1 SMP Thu Oct 18 11:51:13 PDT 2012 x86_64 x86_64 x86_64 GNU/Linux; cat /etc/redhat-release: Red Hat Enterprise Linux Server release 5.8 (Tikanga); cat /etc/oracle-release: Oracle Linux Server release 5.8. Thanks
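
    Since the asker reports that a remount brings nlockmgr back, one stop-gap is a cron'd health check built on that observation. This is a hedged workaround sketch (the mount point is a placeholder), not a root-cause fix:

      #!/bin/sh
      # if the lock manager is no longer registered with the portmapper, remount to kick it awake
      if ! rpcinfo -p | grep -q nlockmgr; then
          logger "nlockmgr not registered; remounting NFS share"
          mount -o remount /mnt/nfs_share
      fi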

    Read the article

  • Ctrl-Alt-F1 never gives prompt

    - by nispio
    When I press Ctrl+Alt+F1 on my Red Hat Enterprise Linux 6.4 (Santiago) system I get a black screen with a blinking cursor, but never a login prompt. I can get back to my desktop with Ctrl+Alt+F7, and things work as normal. However, when I run sudo init 3 I get the same thing, except this time I am stuck at that screen and can't get back without a reboot. I'm not even sure where to start looking for the cause. Are there startup scripts or other system configuration settings that might be causing this? If this is not a known issue, what steps should I take to debug it?
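
    On RHEL 6 the text-console gettys are spawned by Upstart, so a reasonable first check is whether the tty jobs are configured and running at all; these paths are standard on RHEL 6:

      grep ACTIVE_CONSOLES /etc/sysconfig/init   # should read /dev/tty[1-6]
      cat /etc/init/start-ttys.conf              # Upstart job that spawns the gettys
      sudo initctl list | grep tty               # are tty1..tty6 actually running?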

    Read the article

  • reclaim space after moving indexes to file group

    - by Titan2782
    I have an extremely large database and most of the space is taken up by indexes. I moved several indexes to a different filegroup (just to experiment), but no matter what I do I cannot reduce the size of the MDF. I've tried shrinking the database, shrinking the files, and rebuilding the clustered indexes. What can I do to reclaim that space in the MDF? I've moved 15 GB worth of indexes to a different filegroup; is it even possible to reduce my MDF by that same 15 GB (or close to it)? SQL Server 2008 Enterprise.
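
    For reference, a targeted shrink of just the data file usually works better than a whole-database shrink here; a hedged T-SQL sketch (the database and logical file names are placeholders, and the target size is in MB):

      USE MyLargeDb;
      -- move pages to the front of the .mdf and release the freed space to the OS
      DBCC SHRINKFILE (N'MyLargeDb_Data', 20000);   -- target roughly 20 GB
      -- note: TRUNCATEONLY only releases free space at the end of the file, so it often appears
      -- to do nothing; a targeted shrink relocates pages first, then truncates. Shrinking
      -- fragments indexes, so plan to rebuild them afterwards.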

    Read the article

  • my.ini optimization on Windows 2008 R2 VPS

    - by MKphpDev
    I have a VMware VPS running Windows Server 2008 R2 Enterprise that has performance issues with MySQL. Every few minutes MySQL stalls for a few seconds and then responds to queries again. I'm sure my.ini needs to be optimized, but unfortunately I don't have much experience with my.ini configuration. What's running on the server: 2 small WordPress blogs, 1 vBulletin forum (approx. 1.2 GB database, and increasing), and a small database for some plug-ins (no more than 4000 records). Server info: Processor: Intel Xeon X5550 @ 2.67GHz, RAM: 6 GB (memory usage has never exceeded 2 GB), MySQL 5.5, PHP 5.3.10, IIS 7. Current my.ini:
    [mysqld]
    default-storage-engine=INNODB
    sql-mode="STRICT_TRANS_TABLES,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION"
    max_connections=250
    myisam_max_sort_file_size=20G
    innodb_additional_mem_pool_size=256M
    innodb_flush_log_at_trx_commit=1
    innodb_log_buffer_size=8M
    innodb_buffer_pool_size=512MB
    innodb_log_file_size=128M
    innodb_thread_concurrency=10
    key_buffer_size = 512M
    myisam_sort_buffer_size = 8M
    join_buffer_size = 256K
    read_buffer_size = 256K
    sort_buffer_size = 256K
    table_cache = 4000
    thread_cache_size = 200
    wait_timeout = 30
    connect_timeout = 10
    tmp_table_size = 32M
    max_allowed_packet = 1M
    max_connect_errors = 10000
    query_cache_size = 16M
    query_cache_limit = 2M
    query_cache_type = 1
    query_cache_min_res_unit = 1024
    query_prealloc_size = 16384
    query_alloc_block_size = 16384
    skip-external-locking
    read_rnd_buffer_size=1M
    max_heap_table_size=16M
    thread_concurrency=8
    [mysqld_safe]
    open_files_limit = 8192
    [mysqldump]
    quick
    max_allowed_packet = 16M
    [myisamchk]
    key_buffer_size = 128M
    sort_buffer_size = 128M
    read_buffer = 2M
    write_buffer = 2M
    Any help with that, please?
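
    Periodic stalls of a few seconds are often InnoDB flushing against an undersized buffer pool. As a hedged starting point only (every value below is an assumption to be tested against this workload, not a tuned recommendation), the InnoDB section could move toward something like:

      # with ~6 GB of RAM and a ~1.2 GB vBulletin database, let InnoDB keep the hot data in memory
      innodb_buffer_pool_size        = 1G
      # flush the log to disk once per second instead of at every commit (small durability trade-off)
      innodb_flush_log_at_trx_commit = 2
      # key_buffer_size only serves MyISAM; 512M is likely oversized if most tables are InnoDB
      key_buffer_size                = 128M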

    Read the article

  • Will ReadyBoost speed up a secondary partition or hard drive?

    - by Sebastian
    I'm building a new computer and plan on using a few spare USB sticks as a ReadyBoost cache. I'll be running Windows 7 Enterprise SP1 x64. I have a single 2 TB disk and plan to make a partition for Windows (100 GB) and use the rest for data. I know ReadyBoost will work nicely for the C: drive, but I am unable to find any information on whether it will accelerate a second drive or partition. Just to clarify, I'm NOT trying to use the hard drive instead of the USB sticks; I'm trying to speed up the second partition using the USB sticks. So, will ReadyBoost work for a secondary partition or hard drive?

    Read the article

  • How might the security of a system be improved using database procedures?

    - by Centurion
    The use of Oracle PL/SQL procedures for controlling access to data is often emphasized in PL/SQL books and other sources as the more secure approach. I've seen several systems where all business logic related to data is performed through packages, procedures and functions, so the application code becomes quite "dumb" and is only responsible for the visualization part. I've even heard some devs call such approaches, and the architects driving them, "database nazi" :) because all the logic code resides in the database. I know about the performance benefits of DB procedures, but right now I'm interested in the "better security" claim when using a thick-client model. I assume such a design is mostly used with Oracle (and maybe MS SQL Server) databases. I agree this approach improves security, but only if there are not many users and every system user has a database account, so we can control and monitor data access through standard database user security. However, how could this approach increase security for an average web system where thick clients are used: for example, one database user with DML grants on all tables, while other users are handled through "users" and "user_rights" tables? We could use DB procedures, save usernames into a context and use that for filtering, but the vulnerability remains at the root: if the main database account is compromised, then nothing will help. Of course, in a real system we might at least have several main users (for example frontend_db_user, backend_db_user).
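
    A minimal PL/SQL sketch of the pattern being discussed, with hypothetical object names: the application sets the acting user in a context once per session, and data access goes through a procedure that filters on it. It illustrates the mechanism only, not a complete security model, and as noted above it does not help if the shared database account itself is compromised.

      CREATE OR REPLACE CONTEXT app_ctx USING app_session_pkg;

      CREATE OR REPLACE PACKAGE app_session_pkg AS
        PROCEDURE set_user(p_username IN VARCHAR2);
      END app_session_pkg;
      /
      CREATE OR REPLACE PACKAGE BODY app_session_pkg AS
        PROCEDURE set_user(p_username IN VARCHAR2) IS
        BEGIN
          DBMS_SESSION.SET_CONTEXT('app_ctx', 'username', p_username);
        END set_user;
      END app_session_pkg;
      /
      -- data access goes through procedures that filter on the context, never direct table grants
      CREATE OR REPLACE PROCEDURE get_orders(p_result OUT SYS_REFCURSOR) AS
      BEGIN
        OPEN p_result FOR
          SELECT o.*
            FROM orders o
           WHERE EXISTS (SELECT 1
                           FROM user_rights r
                          WHERE r.username   = SYS_CONTEXT('app_ctx', 'username')
                            AND r.right_name = 'VIEW_ORDERS');
      END get_orders;
      /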

    Read the article

  • Schedule robocopy run

    - by xeonet
    I have Windows Server 2008 Enterprise. I need to copy files from a network folder that I have connected as a Z: drive, and I need to schedule the copy. In Task Scheduler I run it every 5 minutes: robocopy.exe Z:\ C:\destination /E. I've tried putting it in a .bat file and entering it directly in the scheduler, but it doesn't help. I've also set it to run with highest privileges... Task Scheduler successfully completed task "\RoboCopy", instance "{dd2d2d1c-4ef1-4e30-b226-4a77aa52dab9}", action "C:\Windows\SYSTEM32\cmd.exe" with return code 16.
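
    Robocopy exit code 16 means a fatal error with nothing copied, and a very common cause under Task Scheduler is that the Z: mapping doesn't exist in the task's logon session (drive letters are per-logon). A hedged sketch using the UNC path directly; the share name, retry counts, and log path are placeholders, and the account running the task still needs read access to the share:

      rem robocopy-backup.cmd - run from Task Scheduler without relying on a mapped drive
      robocopy \\fileserver\share C:\destination /E /R:2 /W:5 /LOG+:C:\logs\robocopy.log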

    Read the article

  • sa2 -A /var/log/sa/sa13: No such file or directory

    - by user53925
    I have sysstat version 7.0.2, and /etc/sysconfig/sysstat has the entry HISTORY=27. This is on a Red Hat Enterprise Linux 5.6 server. The cron setup for this is: # run system activity accounting tool every minute * * * * * root /usr/lib64/sa/sa1 1 1 # generate a daily summary of process accounting at 23:53 53 23 * * * root /usr/lib64/sa/sa2 -A. From the sa2 -A cron job I get the error: find: /var/log/sa/sa13: No such file or directory. Looking at the /var/log/sa directory, files exist from sa01 through sa10 (sa01 created on Sep 1, sa02 created on Sep 2, and so on), and then from sa14 through sa31 (created from Aug 14 to Aug 31). I have not made any changes on the server, so I am not sure why I am getting these error messages. Is there a way to fix this? Someone suggested creating empty files sa11 through sa14 to work around it, but I am not sure if that might mess something up.

    Read the article

  • Missing Desktop Icon Labels in Windows 7

    - by Buzzedword
    Hey guys. Big problem that has been bothering me to no end. My desktop icons have been stripped of their labels, and nothing seems to recover them. To be clear, when I attempt to rename an icon the text shows up, and when I view my desktop in an Explorer window all the text is preserved. A system restore to a stock state does not fix it. No changes had been made to the computer - no installs or downloads for two weeks prior to this error. I rebuilt the icon cache, still no luck. Anybody know what could be causing this problem? Screenshot below. Thanks in advance. OS: Windows 7 Enterprise 64-bit. Profile: local, non-roaming. Image: http://img688.imageshack.us/img688/5152/capturemr.png

    Read the article

  • Export policy configuration from eTrust ITM 8.1

    - by grub
    We are currently using the enterprise AV solution eTrust ITM 8.1. The licenses run out in October, and we are going to replace eTrust with another AV solution. The eTrust server runs on Windows Server 2003 SP2 with MS SQL 2000 Standard Edition. The problem is that we've got many different policy sets which we have to redefine in the new AV solution. Is there any way with eTrust ITM 8.1 to export the different policies as CSV, PDF ... whatever? I really don't want to do it manually (that would mean one screenshot after the other ;-) ). Thank you very much. grub

    Read the article

  • Can I use a Windows 2008 R2 cluster for file redundancy?

    - by JERiv
    I'm researching a server clustering architecture as a redundancy and backup solution for a client, and something that isn't made clear is whether or not I can use server clustering to replace a file server plus a backup solution. Forgive my elementary understanding of server clustering, but suppose: 2 sites (NJ, CA); identical servers at each site set up as remote-site cluster nodes with Windows Server 2008 R2 Enterprise; services: File, Terminal, AD, and maybe DNS. Will the following be true: files (including data drives) will be synced between the two servers, eliminating the need for third-party backup/mirroring software to sync and back up files? Also, supposing I use roaming profiles with folder redirection, how will client computers in the WAN access their data through the cluster (i.e. will they automatically choose the best route)?

    Read the article

  • Interconnection between 2 computers in different networks.

    - by cripox
    Hi, what I want is to connect 2 computers (work and personal), primarily for using a software KVM (Input Director or Synergy). Transferring files between them would be a plus. The main issue is that the work computer is in a secured enterprise network, and my personal computer uses a 3G+ modem for Internet access. On the work computer I do not have Internet access (only the local network). I want to somehow connect them without messing up either network, and I want my personal computer not to be visible on the work network. Is this possible? Suggestions so far: use a simple UTP cable to connect the 2 computers directly (can each be on both networks without issues?), or use some kind of USB link cable, if such a thing exists.

    Read the article

  • We have a Solaris 9 server running Oracle 10G and have been getting memory consumption errors for a few weeks now

    - by another_netadmin
    We recently upgraded our enterprise application and everything worked OK until one weekend when we did a server reboot; ever since then we have run into memory errors. The server has 4 GB of physical memory installed, and the kernel parameters are set to the following in /etc/system. I'm not an Oracle guy, so I'm not sure where to start looking, but any information is greatly appreciated. Thanks in advance. There are two databases running on this server: one is a production database and the other is a pre-production database. [root@bandb /]# cat /etc/system | grep seminfo set semsys:seminfo_semmni=100 set semsys:seminfo_semmns=2048 set semsys:seminfo_semmsl=400 set semsys:seminfo_semopm=100 set semsys:seminfo_semvmx=32767 [root@bandb /]# cat /etc/system | grep shminfo set shmsys:shminfo_shmmax=4294967295 set shmsys:shminfo_shmmin=1 set shmsys:shminfo_shmmni=100 set shmsys:shminfo_shmseg=10 [root@bandb /]#
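
    With two instances on a 4 GB box, a quick sanity check is to compare what the instances are asking for against the shared-memory and semaphore limits above. A hedged sketch (run per instance, with ORACLE_SID set appropriately):

      # existing shared-memory segments and their sizes, to compare against shmsys:shminfo_shmmax
      ipcs -mb
      # how much SGA each instance is configured to use
      echo 'show sga' | sqlplus -s / as sysdba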

    Read the article

  • OTN Developer Days - Calgary, Alberta March 18 & Atlanta, GA April 1

    - by dana.singleterry
    Discover a faster way to develop Ajax-enabled applications based on Java and SOA standards. Get hands-on with Oracle JDeveloper, Oracle Application Development Framework and Oracle Fusion Middleware 11g. You are invited to attend Oracle Technology Network (OTN) Developer Day, a free, hands-on workshop that will give you insight into how to create Ajax-enabled rich Web user interfaces and Java EE-based SOA services with ease. We'll introduce you to the development platform Oracle is using for its Fusion enterprise applications, and show you how to get up to speed with it. The workshop will get you started developing with the latest versions of Oracle JDeveloper and Oracle ADF 11g, including the Ajax-enabled ADF Faces rich client components. Thursday, March 18, 2010, 8:00 a.m. - 5:00 p.m., Calgary Marriott Hotel, 110 9th Avenue SE, Calgary, Alberta T2G 5A6. Wednesday, April 1, 2010, 8:00 a.m. - 5:00 p.m., Four Seasons Hotel Atlanta, 75 Fourteenth Street, Atlanta, Georgia 30309. This workshop is designed for developers, project managers, and architects. Whether you are currently using Java, traditional 4GL tools like Oracle Forms, PeopleTools, and Visual Basic, or are just looking for a better development platform - this session is for you. Get explanations from Oracle experts, try your hand at actual development, and get a chance to win an Apple iPod Touch and Oracle prizes. Come see how Oracle can help you deliver cutting-edge UIs and standards-based applications faster with the Oracle Fusion development software stack. At this event you will: * Get to know the Oracle Fusion development architecture and strategy from Oracle's experts. * Learn the easy way to extend your existing development skill sets to incorporate new technologies and architectures including Service-Oriented Architecture, Java EE, and Web 2.0. * Participate in hands-on labs and experience new technologies in a familiar and productive development environment with Oracle experts' guidance. Click on Register Now Calgary, Alberta to register for the Calgary event, and click on Register Now Atlanta, GA to register for the free Atlanta event. Don't miss your exclusive opportunity to network with your peers and discuss today's most vital application development topics with Oracle experts.

    Read the article

  • Export data to Excel from Silverlight/WPF DataGrid

    - by outcoldman
    Exporting data from a DataGrid to Excel is a very common task. It can be solved in different ways, and the right way depends on the kind of app you are designing. If you are developing an app for the enterprise that will be installed on several computers, you can state system requirements that clients must meet for your app to work, or the customer will dictate the system requirements your app has to run on. In this case you can use COM for the export (using the infrastructure of Excel or OpenOffice). This approach gives you much more flexibility and lets you use all the features of the Excel application; I'll talk about it below. The other case is an app for personal use that can be installed on any home computer; here it is not reasonable to ask the user to install MS Office or OpenOffice just to use your app. Instead you can use third-party tools for the export, or export to an XML/HTML format which MS Office can read (the approach used by JIRA). But in this case it is harder to satisfy user needs such as creating a document with landscape orientation and defined print margins. In this article I'll show you how to work with the Excel object from .NET 4 and Silverlight 4 using dynamic objects, and give you an approach that lets you export data from the Silverlight and WPF DataGrid controls. Read more...
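
    The late-bound COM approach the article describes looks roughly like the sketch below in .NET 4 (in Silverlight 4 the equivalent entry point is AutomationFactory.CreateObject in an elevated-trust out-of-browser app). Cell values and the output path are hypothetical; a real exporter would loop over the DataGrid's ItemsSource:

      // C# / .NET 4: drive Excel through COM via 'dynamic' - requires Excel installed on the machine
      using System;

      class ExcelExportSketch
      {
          static void Main()
          {
              Type excelType = Type.GetTypeFromProgID("Excel.Application");
              dynamic excel = Activator.CreateInstance(excelType);
              excel.Visible = true;

              dynamic workbook = excel.Workbooks.Add();
              dynamic sheet = workbook.ActiveSheet;

              // header row plus one sample data row
              sheet.Cells[1, 1] = "Name";
              sheet.Cells[1, 2] = "Amount";
              sheet.Cells[2, 1] = "Sample";
              sheet.Cells[2, 2] = 42;

              workbook.SaveAs(@"C:\temp\export.xlsx");   // target folder must exist
              workbook.Close();
              excel.Quit();
          }
      }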

    Read the article

  • Windows 8 / IIS 8 Concurrent Requests Limit

    - by OWScott
    IIS 8 on Windows Server 2012 doesn't have any fixed concurrent request limit, apart from whatever limit is reached when resources are maxed out. However, the client version of IIS 8, which ships with Windows 8, does have a concurrent connection request limitation to discourage high-traffic production use on a client edition of Windows. Starting with IIS 7 (Windows Vista), the behavior changed from previous versions. In earlier client versions of IIS, excess requests would throw a 403.9 error (Access Forbidden: Too many users are connected). Instead, Windows Vista, 7 and 8 queue excess requests so that they are handled gracefully, although there is a maximum number of requests that will be processed simultaneously. Thomas Deml provided a concurrent request chart for Windows Vista many years ago, but I have been unable to find an equivalent chart for Windows 8, so I asked Wade Hilmo from the IIS team what the limits are. Since this is controlled not by the IIS team itself but by the Windows licensing team, he asked around and found the authoritative answer, which I'll provide below.
    Windows 8 – IIS 8 concurrent requests limit:
      Windows 8: 3
      Windows 8 Professional: 10
      Windows RT: N/A, since IIS does not run on Windows RT
    Windows 7 – IIS 7.5 concurrent requests limit:
      Windows 7 Home Starter: 1
      Windows 7 Basic: 1
      Windows 7 Premium: 3
      Windows 7 Ultimate, Professional, Enterprise: 10
    Windows Vista – IIS 7 concurrent requests limit:
      Windows Vista Home Basic (IIS process activation and HTTP processing only): 3
      Windows Vista Home Premium: 3
      Windows Vista Ultimate, Professional: 10
    Windows Server 2003, Windows Server 2008, Windows Server 2008 R2 and Windows Server 2012 allow an unlimited number of simultaneous requests.

    Read the article

  • CodePlex Daily Summary for Saturday, March 06, 2010

    CodePlex Daily Summary for Saturday, March 06, 2010New ProjectsAgr.CQRS: Agr.CQRS is a C# framework for DDD applications that use the Command Query Responsibility Segregation pattern (CQRS) and Event Sourcing. BigDays 2010: Big>Days 2010BizTalk - Controlled Admin: Hi .NET folks, I am planning to start project on a Controlled BizTalk Admin tool. This tool will be useful for the organizations which have "Sh...Blacklist of Providers: Blacklist of Providers - the application for department of warehouse logistics (warehouse) at firms.Career Vector: A job board software.Chargify Demo: This is a sample website for ChargifyConceptual: Concept description and animationEric Hexter: My publicly available source code and examplesFluentNHibernate.Search: A Fluent NHibernate.Search mapping interface for NHibernate provider implementation of Lucene.NET.FreelancePlanner: FreelancePlanner is a project tracking tool for freelance translators.HTMLx - JavaScript on the Server for .NET: HTMLx is a set of libraries based on ASP.NET engine to provide JavaScript programmability on the server side. It allows Web developers to use JavaS...IronMSBuild: IronMSBuild is a custom MSBuild Task, which allows you to execute IronRuby scripts. // have to provide some examples LINQ To Blippr: LINQ to Blippr is an open source LINQ Provider for the micro-reviewing service Blippr. LINQ to Blippr makes it easier and more efficent for develo...Luk@sh's HTML Parser: library that simplifies parsing of the HTML documents, for .NETMeta Choons: Unsure as yet but will be a kind of discogs type site but different..NetWork2: NetWork2Regular Expression Chooser: Simple gui for choosing the regular expressions that have become more than simple.See.Sharper: Hopefully useful C# extensions.SharePoint 2010 Toggle User Interface: Toggle the SharePoint 2010 user interface between the new SharePoint 2010 user interface and SharePoint 2007 user interface.Silverlight DiscussionBoard for SharePoint: This is a sharepoint 3.0 webpart that uses a silverlight treeview to display metadata about sharepoint discussions anduses the html bridge to show...Simple Sales Tracking CRM API Wrapper: The Simple Sales Tracking API Wrapper, enables easy extention development and integration with the hosted service at http://www.simplesalestracking...Syntax4Word: A syntax addin for word 2007.TortoiseHg installer builder: TortoiseHg and Mercurial installer builder for Windowsunbinder: Model un binding for route value dictionariesWindows Workflow Foundation on Codeplex: This site has previews of Workflow features which are released out of band for the purposes of adoption and feedback.XNA RSM Render State Manager: Render state management idea for XNA games. Enables isolation between draw calls whilst reducing DX9 SetRenderState calls to the minimum.New ReleasesAgr.CQRS: Sourcecode package: Agr.CQRS is a C# framework for DDD applications that use the Command Query Responsibility Segregation pattern (CQRS) and Event Sourcing. This dow...Book Cataloger: Preview 0.1.6a: New Features: Export to Word 2007 Bibliography format Dictionary list editors for Binding, Condition Improvements: Stability improved Content ...Braintree Client Library: Braintree-1.1.2: Includes minor enhancements to CreditCard and ValidationErrors to support upcoming example application.CassiniDev - Cassini 3.5 Developers Edition: CassiniDev v3.5.0.5: For usage see Readme.htm in download. 
New in CassiniDev v3.5.0.5 Reintroduced the Lib project and signed all Implemented the CassiniSqlFixture -...Composure: Calcium-64420-VS2010rc1.NET4.SL3: This is a simple conversion of Calcium (rev 64420) built in VS2010 RC1 against .NET4 and Silverlight 3. No source files were changed and ALL test...Composure: MS AJAX Library (46266) for VS2010 RC1 .NET4: This is a quick port of Microsoft's AJAX Library (rev 46266) for Visual Studio 2010 RC1 built against .NET 4.0. Since this conversion was thrown t...Composure: MS Web Test Lightweight for VS2010 RC1 .NET4: A simple conversion of Microsoft's Web Test Lightweight for Visual Studio 2010 RC1 .NET 4.0. This is part of a larger "special request" conversion...CoNatural Components: CoNatural Components 1.5: Supporting new data types: Added support for binary data types -> binary, varbinary, etc maps to byte[] Now supporting SQL Server 2008 new types ...Extensia: Extensia 2010-03-05: Extensia is a very large list of extension methods and a few helper types. Some extension methods are not practical (e.g. slow) whilst others are....Fluent Assertions: Fluent Assertions release 1.1: In this release, we've worked hard to add some important missing features that we really needed, and also improve resiliance against illegal argume...Fluent Ribbon Control Suite: Fluent Ribbon Control Suite 1.0 RC: Fluent Ribbon Control Suite 1.0 (Release Candidate)Includes: Fluent.dll (with .pdb and .xml, debug and release version) Showcase Application Sa...FluentNHibernate.Search: 0.1 Beta: First beta versionFolderSize: FolderSize.Win32.1.0.7.0: FolderSize.Win32.1.0.6.0 A simple utility intended to be used to scan harddrives for the folders that take most place and display this to the user...Free Silverlight & WPF Chart Control - Visifire: Silverlight and WPF Step Line Chart: Hi, With this release Visifire introduces Step Line Chart. This release also contains fix for the following issues: * In WPF, if AnimatedUpd...Html to OpenXml: HtmlToOpenXml 1.0: The dll library to include in your project. The dll is signed for GAC support. Compiled with .Net 3.5, Dependencies on System.Drawing.dll and Docu...Line Counter: 1.5.1: The Line Counter is a tool to calculate lines of your code files. The tool was written in .NET 2.0. Line Counter 1.5.1 Added outline icons and lin...Lokad Cloud - .NET O/C mapper (object to cloud) for Windows Azure: Lokad.Cloud v1.0.662.1: You can get the most recent release directly from the build server at http://build.lokad.com/distrib/Lokad.Cloud/Lost in Translation: LostInTranslation v0.2: Alpha release: function complete but not UX complete.MDownloader: MDownloader-0.15.7.56349: Supported large file resumption. Fixed minor bugs.Mini C# Lab: Mini CSharp Lab Ver 1.4: The primary new feature of Ver 1.4 is batch mode! Now you can run Mini C# Lab program as a scheduled task, no UI interactivity is needed. Here ar...Mobile Store: First drop: First droppatterns & practices SharePoint Guidance: SPG2010 Drop6: SharePoint Guidance Drop Notes Microsoft patterns and practices ****************************************** ***************************************...Picasa Downloader: PicasaDownloader (41446): Changelog: Replaced some exception messages by a Summary dialog shown after downloading if there have been problems. 
Corrected the Portable vers...Pod Thrower: Version 1: This is the first release, I'm sure there are bugs, the tool is fully functional and I'm using it currently.PowerShell Provider BizTalk: BizTalkFactory PowerShell Provider - 1.1-snapshot: This release constitutes the latest development snapshot for the Provider. Please, leave feedback and use the Issue Tracker to help improve this pr...Resharper Settings Manager: RSM 1.2.1: This is a bug fix release. Changes Fixed plug-in crash when shared settings file was modified externally.Reusable Library Demo: Reusable Library Demo v1.0.2: A demonstration of reusable abstractions for enterprise application developerSharePoint 2010 Toggle User Interface: SharePoint Toggle User Interface: Release 1.0.0.0Starter Kit Mytrip.Mvc.Entity: Mytrip.Mvc.Entity(net3.5 MySQL) 1.0 Beta: MySQL VS 2008 EF Membership UserManager FileManager Localization Captcha ClientValidation Theme CrossBrowserTortoiseHg: TortoiseHg 1.0: http://bitbucket.org/tortoisehg/stable/wiki/ReleaseNotes Please backup your user Mercurial.ini file and then uninstall any 0.9.X release before in...Visual Studio 2010 and Team Foundation Server 2010 VM Factory: Rangers Virtualization Guidance: Rangers Virtualization Guidance Focused guidance on creating a Rangers base image manually and introduction of PowerShell scripts to automate many ...Visual Studio DSite: Advanced Email Program (Visual Basic 2008): This email program can send email to any one using your email username and email credentials. The email program can also attatch attactments to you...WPF ShaderEffect Generator: WPF ShaderEffect Generator 1.6: Several improvements and bug fixes have gone into the comment parsing code for the registers. The plug-in should now correctly pay attention to th...WSDLGenerator: WSDLGenerator 0.0.0.3: - Fixed SharePoint generated *.wsdl.aspx file - Added commandline option -wsdl which does only generate the wsdl file.Most Popular ProjectsMetaSharpRawrWBFS ManagerAJAX Control ToolkitMicrosoft SQL Server Product Samples: DatabaseSilverlight ToolkitWindows Presentation Foundation (WPF)ASP.NETLiveUpload to FacebookMicrosoft SQL Server Community & SamplesMost Active ProjectsUmbraco CMSRawrSDS: Scientific DataSet library and toolsBlogEngine.NETjQuery Library for SharePoint Web Servicespatterns & practices – Enterprise LibraryIonics Isapi Rewrite FilterFluent AssertionsComposureDiffPlex - a .NET Diff Generator

    Read the article

  • SQL SERVER – Shrinking NDF and MDF Files – Readers’ Opinion

    - by pinaldave
    Previously, I had written a blog post about SQL SERVER – Shrinking NDF and MDF Files – A Safe Operation. After that, I have written the following blog post that talks about the advantage and disadvantage of Shrinking and why one should not be Shrinking a file SQL SERVER – SHRINKFILE and TRUNCATE Log File in SQL Server 2008. On this subject, SQL Server Expert Imran Mohammed left an excellent comment. I just feel that his comment is worth a big article itself. For everybody to read his wonderful explanation, I am posting this blog post here. Thanks Imran! Shrinking Database always creates performance degradation and increases fragmentation in the database. I suggest that you keep that in mind before you start reading the following comment. If you are going to say Shrinking Database is bad and evil, here I am saying it first and loud. Now, the comment of Imran is written while keeping in mind only the process showing how the Shrinking Database Operation works. Imran has already explained his understanding and requests further explanation. I have removed the Best Practices section from Imran’s comments, as there are a few corrections. Comments from Imran - Before I explain to you the concept of Shrink Database, let us understand the concept of Database Files. When we create a new database inside the SQL Server, it is typical that SQl Server creates two physical files in the Operating System: one with .MDF Extension, and another with .LDF Extension. .MDF is called as Primary Data File. .LDF is called as Transactional Log file. If you add one or more data files to a database, the physical file that will be created in the Operating System will have an extension of .NDF, which is called as Secondary Data File; whereas, when you add one or more log files to a database, the physical file that will be created in the Operating System will have the same extension as .LDF. The questions now are, “Why does a new data file have a different extension (.NDF)?”, “Why is it called as a secondary data file?” and, “Why is .MDF file called as a primary data file?” Answers: Note: The following explanation is based on my limited knowledge of SQL Server, so experts please do comment. A data file with a .MDF extension is called a Primary Data File, and the reason behind it is that it contains Database Catalogs. Catalogs mean Meta Data. Meta Data is “Data about Data”. An example for Meta Data includes system objects that store information about other objects, except the data stored by the users. sysobjects stores information about all objects in that database. sysindexes stores information about all indexes and rows of every table in that database. syscolumns stores information about all columns that each table has in that database. sysusers stores how many users that database has. Although Meta Data stores information about other objects, it is not the transactional data that a user enters; rather, it’s a system data about the data. Because Primary Data File (.MDF) contains important information about the database, it is treated as a special file. It is given the name Primary Data file because it contains the Database Catalogs. This file is present in the Primary File Group. You can always create additional objects (Tables, indexes etc.) in the Primary data file (This file is present in the Primary File group), by mentioning that you want to create this object under the Primary File Group. 
Any additional data file that you add to the database will have only transactional data but no Meta Data, so that’s why it is called as the Secondary Data File. It is given the extension name .NDF so that the user can easily identify whether a specific data file is a Primary Data File or a Secondary Data File(s). There are many advantages of storing data in different files that are under different file groups. You can put your read only in the tables in one file (file group) and read-write tables in another file (file group) and take a backup of only the file group that has read the write data, so that you can avoid taking the backup of a read-only data that cannot be altered. Creating additional files in different physical hard disks also improves I/O performance. A real-time scenario where we use Files could be this one: Let’s say you have created a database called MYDB in the D-Drive which has a 50 GB space. You also have 1 Database File (.MDF) and 1 Log File on D-Drive and suppose that all of that 50 GB space has been used up and you do not have any free space left but you still want to add an additional space to the database. One easy option would be to add one more physical hard disk to the server, add new data file to MYDB database and create this new data file in a new hard disk then move some of the objects from one file to another, and put the file group under which you added new file as default File group, so that any new object that is created gets into the new files, unless specified. Now that we got a basic idea of what data files are, what type of data they store and why they are named the way they are, let’s move on to the next topic, Shrinking. First of all, I disagree with the Microsoft terminology for naming this feature as “Shrinking”. Shrinking, in regular terms, means to reduce the size of a file by means of compressing it. BUT in SQL Server, Shrinking DOES NOT mean compressing. Shrinking in SQL Server means to remove an empty space from database files and release the empty space either to the Operating System or to SQL Server. Let’s examine this through an example. Let’s say you have a database “MYDB” with a size of 50 GB that has a free space of about 20 GB, which means 30GB in the database is filled with data and the 20 GB of space is free in the database because it is not currently utilized by the SQL Server (Database); it is reserved and not yet in use. If you choose to shrink the database and to release an empty space to Operating System, and MIND YOU, you can only shrink the database size to 30 GB (in our example). You cannot shrink the database to a size less than what is filled with data. So, if you have a database that is full and has no empty space in the data file and log file (you don’t have an extra disk space to set Auto growth option ON), YOU CANNOT issue the SHRINK Database/File command, because of two reasons: There is no empty space to be released because the Shrink command does not compress the database; it only removes the empty space from the database files and there is no empty space. Remember, the Shrink command is a logged operation. When we perform the Shrink operation, this information is logged in the log file. If there is no empty space in the log file, SQL Server cannot write to the log file and you cannot shrink a database. Now answering your questions: (1) Q: What are the USEDPAGES & ESTIMATEDPAGES that appear on the Results Pane after using the DBCC SHRINKDATABASE (NorthWind, 10) ? 
A: According to Books Online (For SQL Server 2000): UsedPages: the number of 8-KB pages currently used by the file. EstimatedPages: the number of 8-KB pages that SQL Server estimates the file could be shrunk down to. Important Note: Before asking any question, make sure you go through Books Online or search on the Google once. The reasons for doing so have many advantages: 1. If someone else already has had this question before, chances that it is already answered are more than 50 %. 2. This reduces your waiting time for the answer. (2) Q: What is the difference between Shrinking the Database using DBCC command like the one above & shrinking it from the Enterprise Manager Console by Right-Clicking the database, going to TASKS & then selecting SHRINK Option, on a SQL Server 2000 environment? A: As far as my knowledge goes, there is no difference, both will work the same way, one advantage of using this command from query analyzer is, your console won’t be freezed. You can do perform your regular activities using Enterprise Manager. (3) Q: What is this .NDF file that is discussed above? I have never heard of it. What is it used for? Is it used by end-users, DBAs or the SERVER/SYSTEM itself? A: .NDF File is a secondary data file. You never heard of it because when database is created, SQL Server creates database by default with only 1 data file (.MDF) and 1 log file (.LDF) or however your model database has been setup, because a model database is a template used every time you create a new database using the CREATE DATABASE Command. Unless you have added an extra data file, you will not see it. This file is used by the SQL Server to store data which are saved by the users. Hope this information helps. I would like to as the experts to please comment if what I understand is not what the Microsoft guys meant. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Readers Contribution, Readers Question, SQL, SQL Authority, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology
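
    As a small complement to Imran's explanation of UsedPages and EstimatedPages, this T-SQL (standard catalog views, SQL Server 2005 and later) shows allocated versus used space per file, which is effectively what the shrink estimate is based on:

      -- allocated vs. used space per database file; size and SpaceUsed are in 8 KB pages
      SELECT name,
             size / 128.0                                      AS allocated_mb,
             FILEPROPERTY(name, 'SpaceUsed') / 128.0           AS used_mb,
             (size - FILEPROPERTY(name, 'SpaceUsed')) / 128.0  AS free_mb
      FROM sys.database_files;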

    Read the article
