Search Results

Search found 42367 results on 1695 pages for 'resource files'.


  • Running an Application on a Different Domain

    - by Mark Flory
    Where I am contracting right now has a new development domain. Because of IT security rules it is fairly isolated from the domain my computer normally logs into (for e-mail and such). I do use a VM to log directly into that domain, but one of my co-workers found this command to run things on your own box under the other domain's credentials. Pretty cool. For example, this runs SQL Server Management Studio for SQL Server 2008:

        runas /netonly /user:{domain}\{username} "C:\Program Files\Microsoft SQL Server\100\Tools\Binn\VSShell\Common7\IDE\ssms.exe"

    And this runs Visual Studio:

        runas /netonly /user:{domain}\{username} "C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\devenv.exe"

    It does not solve the problem I actually wanted to solve, which was to be able to assign Users/Groups in Team Explorer; that still uses the groups of the domain I am logged into.

    Read the article

  • Automatically Reset Theme To Default, SharePoint 2010

    - by KunaalKapoor
    Manually, through the UI: On the top link bar, click Site Settings. On the Site Management page, in the Customization section, click Apply theme to site. On the Apply Theme to Web Site page, select No Theme (Default) from the list. Click Apply.

    Through script:

        function Apply-SPDefaultTheme([string]$SiteUrl, [string]$webName)
        {
            $site = new-object Microsoft.SharePoint.SPSite($SiteUrl)
            $web = $site.OpenWeb($webName)
            $theme = [Microsoft.SharePoint.Utilities.ThmxTheme]::RemoveThemeFromWeb($web, $false)
            $web.Update()
            $web.Dispose()
            $site.Dispose()
        }

    After looking in the SPTHEMES.XML file found in the C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\TEMPLATE\LAYOUTS\1033 folder, you do see there is a theme with a theme name of "none", since there is no "default" theme in 2010. So if you want to reset it to the default, remember that there is no default; you need to select 'none' :)
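    A hypothetical call of the function above, assuming it is run in a PowerShell session on the SharePoint 2010 server itself (for example the SharePoint 2010 Management Shell); the site URL and web name are placeholders:

        Apply-SPDefaultTheme -SiteUrl "http://intranet" -webName "TeamSite"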

    Read the article

  • How To Remove Authorized PCs from Your Windows Store Account

    - by Taylor Gibb
    One of the awesome things about the Windows Store is that you are allowed to install any app you purchase on up to 5 Windows machines. This means that each PC you install an app on gets added to your trusted PC list. Here’s how to clean up that list.

    Read the article

  • How to automatically mount a Windows shared folder on every boot up?

    - by Zabba
    I am able to access a Windows shared folder from Ubuntu 10.10 in Nautilus like so: type smb://box/projects into the Location Bar. Now I can see the folder in Nautilus and create/read files in it. Also, on the desktop I get a folder called "projects on box". But that folder on the desktop goes away when I reboot. So I thought I could automount the Windows shared projects folder by adding this to my fstab:

        //box/Projects /home/base/Projects smbfs rw,user,username=jack,password=www222,fmask=666,dmask=777 0 0

    (base is my user name on Ubuntu.) Now I get a folder called "Projects" in my home folder after boot up, but it is empty (I cannot see the same files that I can see in Nautilus). What am I doing wrong? Some more detail: this is what I see of the Projects folder when I do ls -l in my home folder:

        ...
        drwxr-xr-x 2 root root 4096 2011-01-01 10:22 Projects
        drwxr-xr-x 2 base base 4096 2011-01-01 09:06 Public
        ...

    Note the two "roots". Is that somehow the problem?
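    One possible fix, sketched on the assumption that the cifs-utils package is installed: on 10.10 the smbfs type is deprecated in favour of cifs, and without uid/gid options the mount point is owned by root, which matches the ls -l output above. Note the difference in case as well (Nautilus used smb://box/projects, while the fstab line uses //box/Projects). A replacement fstab line might look like:

        //box/projects  /home/base/Projects  cifs  username=jack,password=www222,uid=base,gid=base,iocharset=utf8  0  0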

    Read the article

  • Free libraries to work with Excel

    - by Danil Gholtsman
    I have some Excel files; I need to read data from them and upload the data to a database (I need to use Firebird, but whatever). Right now I use QAxObject from Qt (the ActiveQt module) and the code looks like this:

        QAxObject* excel = new QAxObject("Excel.Application");    // pointer to Excel
        //excel->setProperty("Visible", false);
        QAxObject* workbooks = excel->querySubObject("WorkBooks"); // get pointer to the book list
        workbooks->dynamicCall("Open (const QString&)", QString("C:\\databases\\test.xls")); // open the file
        QAxObject* workbook = excel->querySubObject("ActiveWorkBook");
        QAxObject* worksheets = workbook->querySubObject("WorkSheets");

    etc. The problem is that this way Excel must be installed on the user's PC. Do any free C++ libraries exist for working with *.xls and *.xlsx files without Excel installed?
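    One hedged suggestion rather than a definitive answer: the QXlsx/QtXlsx library can read and write .xlsx without Excel being installed (legacy .xls files would still need something else, such as libxls). A minimal read sketch, assuming the library has been added to the project:

        #include "xlsxdocument.h"
        #include "xlsxcellrange.h"
        #include <QDebug>

        QXlsx::Document xlsx("C:/databases/test.xlsx");
        if (xlsx.load()) {
            QXlsx::CellRange range = xlsx.dimension();            // used range of the active sheet
            for (int row = range.firstRow(); row <= range.lastRow(); ++row)
                for (int col = range.firstColumn(); col <= range.lastColumn(); ++col)
                    qDebug() << xlsx.read(row, col);              // one QVariant per cell
        }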

    Read the article

  • Data architecture for event log metrics?

    - by elliot42
    My service has a large ongoing number of user events, and we would like to do things like "count occurrences of event type T since date D." We are trying to make two basic decisions:

    1. What to store: every event, or only aggregates? (Event-log style) log every event and count them later, vs. (time-series style) store a single aggregated "count of event E for date D" for every day.

    2. Where to store the data: in a relational database (particularly MySQL), in a non-relational (NoSQL) database, or in flat log files (collected centrally over the network via syslog-ng)?

    What is standard practice, and where can I read more about comparing the different types of systems? Additional details: the total event stream is large, potentially hundreds of thousands of entries per day, but our current need is only to count certain types of events within it. We don't necessarily need real-time access to the raw data or the aggregation results. IMHO, "log all events to files, crawl them at a later time to filter and aggregate the stream" is a pretty standard UNIX way, but my Rails-y compatriots seem to think that nothing is real unless it's in MySQL.
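    For what it's worth, a minimal sketch of the "crawl the flat log later" approach, assuming a hypothetical record format of an ISO timestamp, a tab, and the event type on each line:

        from collections import Counter
        from datetime import date, datetime

        def count_events(log_path, event_type, since):
            """Count occurrences of event_type per day, on or after `since` (a date)."""
            counts = Counter()
            with open(log_path) as f:
                for line in f:
                    ts, etype = line.rstrip("\n").split("\t", 1)
                    day = datetime.fromisoformat(ts).date()
                    if etype == event_type and day >= since:
                        counts[day] += 1
            return counts

        # usage sketch: count_events("events.log", "signup", date(2012, 1, 1))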

    Read the article

  • Links in my site have been hacked

    - by Funky
    In my site I prefix the images and links with the domain of the site, for better SEO, using the code below:

        public static string GetHTTPHost()
        {
            string host = "";
            if (HttpContext.Current.Request["HTTP_HOST"] != null)
                host = HttpContext.Current.Request["HTTP_HOST"];
            if (host == "site.co.uk" || host == "site.com")
            {
                return "http://www." + host;
            }
            return "http://" + host;
        }

    This works great, but for some reason lots of links have now changed to http://www.baidu.com/... There is no sign of this anywhere in the code or the project, and the files on the server have a change date from when I last published at 11 yesterday, so all the files on there look fine. I am using ASP.NET and Umbraco 4.7.2. Does anyone have any ideas? Thanks
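    One thing worth checking, offered only as a sketch (the injected links may just as easily live in the Umbraco content database or a compromised template rather than in this helper): the method above echoes whatever Host header the client sends, and combined with output caching that can bake a foreign host into rendered links. A hypothetical hardened variant that never returns an unrecognised host:

        // requires System.Web and System.Collections.Generic; the fallback domain is an example
        private static readonly HashSet<string> AllowedHosts =
            new HashSet<string>(StringComparer.OrdinalIgnoreCase) { "site.co.uk", "site.com" };

        public static string GetHttpHostSafe()
        {
            string host = HttpContext.Current.Request["HTTP_HOST"] ?? "";
            if (AllowedHosts.Contains(host))
                return "http://www." + host;
            return "http://www.site.com"; // fall back to the canonical domain
        }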

    Read the article

  • How do I put different textures on different walls? LWJGL

    - by lehermj
    So far I have it so you are running around in a box, but all of the walls are the same texture! I've loaded up other textures for the walls (I want the walls to have a different texture than the floor), but it seems as if it's being ignored... Here's my code:

        int floorTexture = glGenTextures();
        {
            InputStream in = null;
            try {
                in = new FileInputStream("floor.png");
                PNGDecoder decoder = new PNGDecoder(in);
                ByteBuffer buffer = BufferUtils.createByteBuffer(4 * decoder.getWidth() * decoder.getHeight());
                decoder.decode(buffer, decoder.getWidth() * 4, Format.RGBA);
                buffer.flip();
                glBindTexture(GL_TEXTURE_2D, floorTexture);
                glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
                glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
                glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, decoder.getWidth(), decoder.getHeight(), 0, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
                glBindTexture(GL_TEXTURE_2D, floorTexture);
            } catch (FileNotFoundException ex) {
                System.err.println("Failed to find the texture files.");
                ex.printStackTrace();
                Display.destroy();
                System.exit(1);
            } catch (IOException ex) {
                System.err.println("Failed to load the texture files.");
                ex.printStackTrace();
                Display.destroy();
                System.exit(1);
            } finally {
                if (in != null) {
                    try { in.close(); } catch (IOException e) { e.printStackTrace(); }
                }
            }
        }

        int wallTexture = glGenTextures();
        {
            InputStream in = null;
            try {
                in = new FileInputStream("walls.png");
                PNGDecoder decoder = new PNGDecoder(in);
                ByteBuffer buffer = BufferUtils.createByteBuffer(4 * decoder.getWidth() * decoder.getHeight());
                decoder.decode(buffer, decoder.getWidth() * 4, Format.RGBA);
                buffer.flip();
                glBindTexture(GL_TEXTURE_2D, wallTexture);
                glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
                glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
                glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, decoder.getWidth(), decoder.getHeight(), 0, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
                glBindTexture(GL_TEXTURE_2D, wallTexture);
            } catch (FileNotFoundException ex) {
                System.err.println("Failed to find the texture files.");
                ex.printStackTrace();
                Display.destroy();
                System.exit(1);
            } catch (IOException ex) {
                System.err.println("Failed to load the texture files.");
                ex.printStackTrace();
                Display.destroy();
                System.exit(1);
            } finally {
                if (in != null) {
                    try { in.close(); } catch (IOException e) { e.printStackTrace(); }
                }
            }
        }

        int ceilingDisplayList = glGenLists(1);
        glNewList(ceilingDisplayList, GL_COMPILE);
        glBegin(GL_QUADS);
        glTexCoord2f(0, 0); glVertex3f(-gridSize, ceilingHeight, -gridSize);
        glTexCoord2f(gridSize * 10 * tileSize, 0); glVertex3f(gridSize, ceilingHeight, -gridSize);
        glTexCoord2f(gridSize * 10 * tileSize, gridSize * 10 * tileSize); glVertex3f(gridSize, ceilingHeight, gridSize);
        glTexCoord2f(0, gridSize * 10 * tileSize); glVertex3f(-gridSize, ceilingHeight, gridSize);
        glEnd();
        glEndList();

        int wallDisplayList = glGenLists(1);
        glNewList(wallDisplayList, GL_COMPILE);
        glBegin(GL_QUADS);
        // North wall
        glTexCoord2f(0, 0); glVertex3f(-gridSize, floorHeight, -gridSize);
        glTexCoord2f(0, gridSize * 10 * tileSize); glVertex3f(gridSize, floorHeight, -gridSize);
        glTexCoord2f(gridSize * 10 * tileSize, gridSize * 10 * tileSize); glVertex3f(gridSize, ceilingHeight, -gridSize);
        glTexCoord2f(gridSize * 10 * tileSize, 0); glVertex3f(-gridSize, ceilingHeight, -gridSize);
        // West wall
        glTexCoord2f(0, 0); glVertex3f(-gridSize, floorHeight, -gridSize);
        glTexCoord2f(gridSize * 10 * tileSize, 0); glVertex3f(-gridSize, ceilingHeight, -gridSize);
        glTexCoord2f(gridSize * 10 * tileSize, gridSize * 10 * tileSize); glVertex3f(-gridSize, ceilingHeight, +gridSize);
        glTexCoord2f(0, gridSize * 10 * tileSize); glVertex3f(-gridSize, floorHeight, +gridSize);
        // East wall
        glTexCoord2f(0, 0); glVertex3f(+gridSize, floorHeight, -gridSize);
        glTexCoord2f(gridSize * 10 * tileSize, 0); glVertex3f(+gridSize, floorHeight, +gridSize);
        glTexCoord2f(gridSize * 10 * tileSize, gridSize * 10 * tileSize); glVertex3f(+gridSize, ceilingHeight, +gridSize);
        glTexCoord2f(0, gridSize * 10 * tileSize); glVertex3f(+gridSize, ceilingHeight, -gridSize);
        // South wall
        glTexCoord2f(0, 0); glVertex3f(-gridSize, floorHeight, +gridSize);
        glTexCoord2f(gridSize * 10 * tileSize, 0); glVertex3f(-gridSize, ceilingHeight, +gridSize);
        glTexCoord2f(gridSize * 10 * tileSize, gridSize * 10 * tileSize); glVertex3f(+gridSize, ceilingHeight, +gridSize);
        glTexCoord2f(0, gridSize * 10 * tileSize); glVertex3f(+gridSize, floorHeight, +gridSize);
        glEnd();
        glEndList();

        int floorDisplayList = glGenLists(1);
        glNewList(floorDisplayList, GL_COMPILE);
        glBegin(GL_QUADS);
        glTexCoord2f(0, 0); glVertex3f(-gridSize, floorHeight, -gridSize);
        glTexCoord2f(0, gridSize * 10 * tileSize); glVertex3f(-gridSize, floorHeight, gridSize);
        glTexCoord2f(gridSize * 10 * tileSize, gridSize * 10 * tileSize); glVertex3f(gridSize, floorHeight, gridSize);
        glTexCoord2f(gridSize * 10 * tileSize, 0); glVertex3f(gridSize, floorHeight, -gridSize);
        glEnd();
        glEndList();
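    The render loop is not shown in the excerpt, but one likely cause: a display list compiled with GL_COMPILE only records a texture binding if glBindTexture is called between glNewList and glEndList, and here the bind calls happen during texture loading instead. At glCallList time, whichever texture is currently bound is applied to everything. A sketch of a draw sequence that binds a different texture before each list (names taken from the code above):

        glEnable(GL_TEXTURE_2D);

        glBindTexture(GL_TEXTURE_2D, wallTexture);
        glCallList(wallDisplayList);

        glBindTexture(GL_TEXTURE_2D, floorTexture);
        glCallList(floorDisplayList);
        glCallList(ceilingDisplayList);   // the ceiling reuses floorTexture here; rebind for a third texture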

    Read the article

  • Where are my date ranges in Analytics coming from?

    - by Jeffrey McDaniel
    In the P6 Reporting Database there are two main tables to consider when viewing time: W_DAY_D and W_Calendar_FS. W_DAY_D is populated internally during the ETL process and provides a row for every day in the given time range. Each row contains aspects of that day such as calendar year, month, week, and quarter, so it can be used as the time element when creating requests in Analytics that group data into these granularities. W_Calendar_FS is used for calculations such as spreads, but is based on the same date range. The min and max day_dt (W_DAY_D) and daydate (W_Calendar_FS) are determined by the defined date range, which is a start date plus a rolling interval of a certain length, generally the start date plus 3 years. In P6 Reporting Database 2.0 this date range was defined in the Configuration utility. As of P6 Reporting Database 3.0, with the introduction of the Extended Schema, this date range is set in the P6 web application.

    The Extended Schema uses this date range to calculate the data for near-real-time reporting in P6. The same date range is validated and used for the P6 Reporting Database. The rolling date range means that if today is April 1, 2010 and the rolling interval is set to three years, the min date will be 1/1/2010 and the max date will be 4/1/2013. 1/1/2010 is the min date because we always back-fill to the beginning of the year. On April 2nd, the Extended Schema services run and the date range is adjusted there to move the max date forward to 4/2/2013. When the ETL process runs, the Reporting Database picks up this change and also adjusts the max date in W_DAY_D and W_Calendar_FS. There are scenarios where date ranges affecting areas like resource limits may not be adjusted until a change occurs to cause a recalculation, but with general system usage the dates in these tables will progress forward with the rolling interval.

    Choosing a large date range can have an effect on the ETL process for the P6 Reporting Database. The extract portion of the process pulls spread data over into the STAR. The date range defines how far out activity and resource assignment spread data is spread in these tables. If an activity lasts 5 days, it will have 5 days of spread data. If a project lasts 5 years and the date range is 3 years, the spread data beyond that 3-year range is bucketed into the last day of the date range. For the overall project, and even at the activity level, you will still see the correct total values; you just would not be able to see the daily spread 5 years from now. This is the important question when choosing your date range: do you really need to see spread data down to the day 5 years into the future? Generally that level of granularity years in the future is not needed. Remember, all those values 5, 10, 15, 20 years in the future are still available to report on; they would just be in more of a summary format at the activity or project level. The data is always there; the level of granularity is the decision.

    Read the article

  • How to direct a Network Solutions domain name to an html website hosted on Google Drive? [on hold]

    - by Air Conditioner
    To begin with, I wanted to take advantage of HTML, CSS, and so on to build a website that looks and works just as I'd like it to. I took a look around at how I could make that work, and I soon saw a Lifehacker article showing that it's possible to host website files on Google Drive. I then made sure that the folder containing the files was shared publicly on the web, and I now have a working Google Drive-hosted URL for the website. However, I did want the custom domain, and so I registered one with Network Solutions. So now I'm curious how I should direct my Network Solutions domain to the index.html I'm hosting on Google Drive. Would anyone have an idea?

    Read the article

  • Given two sets of DNA, what does it take to computationally "grow" that person from a fertilised egg and see what they become? [closed]

    - by Nicholas Hill
    My question is essentially entirely in the title, but let me add some points to head off the "why on earth would you want to do that" sort of answers: This is more of a thought experiment than an attempt to implement real software. For fun. Don't worry about computational speed or the number of available memory bytes; computers get faster and better all the time. Imagine we have two data files: Mother.dna and Father.dna. What else would be required? (Bonus point for someone who tells me approximately how many GB each file would be, and whether the files would be exactly the same number of bytes for everyone alive on Earth!) There would ideally need to be a way to see what the egg becomes as it grows into a human adult. If you fancy, feel free to outline the design. I am initially thinking that there would need to be some sort of volumetric, voxel-based 3D environment for simulation purposes.

    Read the article

  • How to include content from remote server while keeping that content secure

    - by slayton
    I am hosting a collection of videos, for which I retain the copyright, on a file server that I'd like to share with family and friends. When a user visits my file server via a web browser they are asked to authenticate using HTTP auth, and then they are presented with a basic list of the files. I'd like to build a web application that provides a clean interface with simple library functionality. However, this app will be hosted on a different server. I'm trying to figure out a security model for my file server that doesn't require the user to log in to both the file server and the hosting server. I want to make this as easy as possible for my non-tech-savvy family while still maintaining security for my files.

    Read the article

  • Upload image file: is compression on client side already possible?

    - by Chris
    When offering photo file uploading, the user will usually have badly compressed and huge (10+ megapixel) JPEG files from their cameras or phones. On the server side, these files get re-compressed to something like 800x600px and JPEG quality 7 or 8. Is it (already) possible to do that re-compression on the client side, so that I would only need to transmit some 100 kB (800x600px) and not 3 MB or more? Something like: (1) With JavaScript's new FileSystem API ( http://slides.html5rocks.com/#filewriter ) it would be possible to read the photo file's data into client-side JS. (2) Then it would be necessary to re-encode the JPEG data, which should be possible, but I could not find any library for that (yet). Does anybody know of such a library? (3) The last step would be to POST the re-compressed JPEG data to the server for storage and get back a URL to the stored photo for inclusion in the client's HTML. I am looking for a jQuery plugin, another JS library, or an example web page that does this.
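    For what it's worth, a sketch of one way to do the resize and re-encode on the client with the canvas element, with no extra library (the size and quality values are examples, and canvas.toBlob may need a fallback in older browsers):

        function recompressImage(file, maxWidth, maxHeight, quality, onReady) {
            var img = new Image();
            img.onload = function () {
                var scale = Math.min(maxWidth / img.width, maxHeight / img.height, 1);
                var canvas = document.createElement('canvas');
                canvas.width = Math.round(img.width * scale);
                canvas.height = Math.round(img.height * scale);
                canvas.getContext('2d').drawImage(img, 0, 0, canvas.width, canvas.height);
                URL.revokeObjectURL(img.src);
                canvas.toBlob(onReady, 'image/jpeg', quality);  // the Blob can be POSTed via FormData/XHR
            };
            img.src = URL.createObjectURL(file);  // file comes from an <input type="file">
        }

        // usage sketch: recompressImage(input.files[0], 800, 600, 0.8, function (blob) { /* upload blob */ });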

    Read the article

  • Get script for every action in SQL Server Management Studio

    I am always careful to keep a record of all operations performed on my database servers. Operations done through T-SQL in an SSMS query pane can easily be saved in query files, and for table modifications through the SSMS designer I have a predefined setting to generate T-SQL scripts. However, there are numerous database- and server-level tasks for which I use the SSMS GUI, and I would like to have a script of those changes for later reference. Examples of such actions through the SSMS GUI are backup/restore, changing the compatibility level of a database, manipulating permissions, dealing with database or log files, and creating or manipulating any login/user. I am looking for any way to generate T-SQL code for such actions, so that it may be kept for later reference.
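    One partial approach worth sketching (not a complete audit solution): many GUI actions of this kind are captured by SQL Server's default trace, so a query against it can recover at least some of the history after the fact. The trace functions and columns below are standard, but which events are recorded depends on the instance's configuration:

        -- read recent events from the default trace; the file path is instance-specific
        SELECT t.StartTime, t.LoginName, t.DatabaseName, t.ObjectName, t.TextData
        FROM sys.fn_trace_gettable(
                 (SELECT TOP 1 [path] FROM sys.traces WHERE is_default = 1), DEFAULT) AS t
        WHERE t.EventClass IS NOT NULL
        ORDER BY t.StartTime DESC;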

    Read the article

  • MySQL, An Ideal Choice for The Cloud

    - by Bertrand Matthelié
    As the world's most popular web database, MySQL has quickly become the leading database for the cloud, with most providers offering MySQL-based services. Access our Resource Kit to discover:

    - Why MySQL has become the leading database in the cloud, and how it addresses the critical attributes of cloud-based deployments
    - How ISVs rely on MySQL to power their SaaS offerings
    - Best practices to deploy the world's most popular open source database in public and private clouds

    You will also find out how you can leverage MySQL together with Hadoop and other technologies to unlock the value of Big Data, either on-premise or in the cloud. Access white papers, webinars, case studies and other resources in our Resource Kit now!

    Read the article

  • disk not accessible

    - by user107044
    I formatted my hard drive yesterday and it was working well even after the formatting. But when I restarted my system, it shows that the space is allotted to my files but they are inaccessible. I have even tried to unhide the files and folders, in case they somehow got hidden, but nothing works. The hard drive is shown as empty, but its properties say that it still contains the data: http://imgur.com/ObjTE In the image, it shows that the directory has only 1 file of size 4.8 KB, but the space being used by the drive is 11.6 GB. Do suggest some solution.

    Read the article

  • What Part of Your Project Should be in Source Code Control?

    - by muffinista
    A fellow developer has started work on a new Drupal project, and the sysadmin has suggested that they should only put the sites/default subdirectory in source control, because it "will make updates easily scriptable." Setting aside that somewhat dubious claim, it raises another question -- what files should be under source control? And is there a situation where some large chunk of files should be excluded? My opinion is that the entire tree for the project should be under control, and this would be true for a Drupal project, rails, or anything else. This seems like a no-brainer -- you clearly need versioning for your framework as much as you do for any custom code you write. That said, I would love to get other opinions on this. Are there any arguments for not having everything under control? Is this sysadmin a BOFH?

    Read the article

  • What are the best Small Business Servers/Storage

    - by nasty
    I am a web designer/developer and work mostly with large files, 50 MB+. I currently have a MyBook Live which is connected wirelessly to my MacBook Pro and a Dell desktop PC running Windows. Since it's connected wirelessly (it doesn't have an Ethernet port), the files load slowly and it's hard to work straight off the server. Now I'm looking for better storage/server solutions and looking at dell.com.au/servers, but I'm not sure which one to choose. Can you give me some suggestions on whether I should buy the AUD 599 model or upgrade to something more? Is anyone else here having the same issue as me?

    Read the article

  • Slow writing HDD speed, Ubuntu 12.04 64-bit, Thinkpad T520i

    - by pyc
    It seems (though I'm not completely sure) that when I'm copying files from the gigabit network to the HDD, I can't use the full potential of the network, which in my case is about 60 MB/s, because HDD writing is very slow, lower than 10 MB/s. It also slows down the whole system, which becomes pretty much unresponsive, almost impossible to work with. I'm copying files to a Samba share residing on the Ubuntu machine, connected to the share from Windows 7. I'm completely sure my network equipment is OK, and there's no CPU-intensive process on Ubuntu except smbd taking about 10-20% from time to time, which I think is fine. Something here is buried deep, I think, maybe even in the kernel. I already tried switching from AHCI to compatibility mode, and turning ACPI on and off; nothing helped. So it's like the HDD buffer is full and emptying slowly while the machine is sluggish; load is about 3 to 4. Has anybody experienced similar problems? Some help with the troubleshooting process and identifying the cause would be helpful too :) Thanks!
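    A rough troubleshooting sketch to separate the disk from the network (iostat comes from the sysstat package; the file path is an example):

        # measure raw local write throughput with the network out of the picture (writes a 1 GB test file)
        dd if=/dev/zero of=/tmp/ddtest bs=1M count=1024 conv=fdatasync

        # watch per-device utilisation and wait times every 2 seconds while a network copy is running
        iostat -xm 2

    If the local dd figure is also far below what the drive should manage, the problem is on the disk/controller side rather than in Samba or the network.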

    Read the article

  • System checks for disk drive error every time it boots

    - by Starx
    When the disk space for my Ubuntu installation partition was getting low, I used GParted from a live CD to increase its capacity by deleting another partition and merging it into the Ubuntu partition. Since then, I always get disk checking for errors on my partitions at the boot screen. What seems to be causing this, and how do I fix it?

    Update: here is my boot.log, if it provides any insight:

        fsck from util-linux 2.19.1
        fsck from util-linux 2.19.1
        /dev/sda1 was not cleanly unmounted, check forced.
        ubuntu: clean, 501325/1310720 files, 2958455/5242880 blocks
        /dev/sda1: 241/51272 files (3.3% non-contiguous), 73541/102400 blocks
        mountall: fsck /boot [358] terminated with status 1
        Skipping profile in /etc/apparmor.d/disable: usr.bin.firefox
        ...

    /dev/sda1 is a separate GRUB partition for my dual OSes.
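    A diagnostic sketch (the device name is taken from the log above; run the e2fsck step only from a live CD, with the partition unmounted):

        # show the recorded filesystem state and the check counters for the boot partition
        sudo tune2fs -l /dev/sda1 | grep -iE 'state|mount count|check'

        # force a full check while the filesystem is unmounted, so it can be marked clean
        sudo e2fsck -f /dev/sda1

    The "was not cleanly unmounted, check forced" line suggests that partition is never being marked clean, so a full offline check is the usual first step.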

    Read the article

  • How does thumbnail preview in Ubuntu differ from that of Windows? [closed]

    - by Forbidden Overseer
    Possible Duplicate: How does Ubuntu know what file type a file without extension is. I thought this question might get a better response on Ask Ubuntu, as it seems to have more to do with Ubuntu than Windows at a glance. Let's say I have a foo.mkv file. Thumbnail previews work in both Windows 7 and Ubuntu. When I change the filename to anything random like foo.bar, or when I remove the extension itself (making it just foo), Nautilus shows thumbnails normally, as if it can recognize what type of file it is without looking at the file extension. This, however, doesn't happen in Windows 7: Windows starts asking me which application I want to use to open the file as soon as I remove the extension (forget thumbnails...), etc. So, how does this thumbnail preview work in Windows 7 and Ubuntu? What makes Ubuntu recognize files "out of the box", unlike Windows 7?

    Read the article

  • Xubuntu: how do I automatically mount external NTFS drive with writes allowed?

    - by user74372
    I would have thought mine was such a common question that there would be a simple solution already built into Xubuntu, but there isn't. I have 2 separate external hard disks and connect them to a USB port at different times. I would like them to be automatically mounted as read/write, but apparently the designers of gnome-volume-manager decided that shouldn't be possible, and they are mounted as read-only. In fact, I can write new files to them, but cannot then delete the new files I just wrote! Is there a workaround? I read somewhere that /etc/fstab doesn't apply to removable media, which are mounted by gnome-volume-manager and therefore cannot be unmounted by a user.
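    Despite that last point, an /etc/fstab entry using the ntfs-3g driver is one common workaround for a drive that always gets mounted at the same place. A sketch only: the UUID is a placeholder (sudo blkid prints the real one), 1000 is the usual uid/gid of the first user, and nofail keeps boot from stalling when the disk is unplugged:

        UUID=XXXX-XXXX  /media/external  ntfs-3g  defaults,uid=1000,gid=1000,umask=022,nofail  0  0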

    Read the article

  • Alternatives to sql like databases

    - by user613326
    Well, I was wondering: these days computers usually have 2 GB or 4 GB of memory. I would like to use a secure client-server model, and an SQL database is a likely candidate. On the other hand, I only have about 8000 records, which will not be read or written frequently; in total they would consume less than 16 megabytes. That made me wonder what good, secure options there are in a Windows environment to store the data and work with it in a multi-client, single-server model, without using SQL or MySQL. For such a small amount of data, would other ideas be better? I would like to keep maintenance as simple as possible (no administrator should need to know SQL maintenance, as they don't know databases in my target environment). Maybe storing in XML files or... something else. I just wonder how others would approach this if ease of administration is the main goal. Oh, and it should be secure too; the client-server data must be a bit secure (maybe NTLM, file shares, HTTPS or... etc.).

    Read the article
