Search Results

Search found 389 results on 16 pages for 'dword'.

Page 13/16 | < Previous Page | 9 10 11 12 13 14 15 16  | Next Page >

  • TAPI lineGetAddressID() fails with LINEERR_INVALADDRESS

    - by PaulH
    I have a Windows Mobile 6 application using TAPI 2.0. lineGetAddressID() is needed to get the address identifier used by several calls in the telephony API, but I can't get it to work. I have tried the following to no avail: HLINE line; // valid handle from lineOpen(); DWORD addr_id = 0; result = ::lineGetAddressID( line, &addr_id, LINEADDRESSMODE_DIALABLEADDR, L"1234", 5 ); result = ::lineGetAddressID( line, &addr_id, LINEADDRESSMODE_DIALABLEADDR, L"5551234", 8 ); result = ::lineGetAddressID( line, &addr_id, LINEADDRESSMODE_DIALABLEADDR, L"1115551234", 11 ); result = ::lineGetAddressID( line, &addr_id, LINEADDRESSMODE_DIALABLEADDR, L"11115551234", 12 ); All of them return LINEERR_INVALADDRESS. Can anybody point out what I may be doing wrong? As a side question, how can I programmatically get the address? It appears in the LINEADDRESSCAPS structure returned by lineGetAddressCaps(), but that requires an address identifier (which would need to come from lineGetAddressID(), which requires an address...). Note: I realize I could use 0 as the address ID and it will probably work, but I have no guarantee it will work for every platform. I would like to get this solved 'right'. Thanks, PaulH
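
    One hedged sketch of a different approach, assuming the goal is to discover the address identifiers rather than look one up by dialable string: enumerate the IDs 0..dwNumAddresses-1 reported by lineGetDevCaps() and read each address back from lineGetAddressCaps(). The handles hLineApp, deviceId and apiVersion are assumed to already exist; buffer handling is simplified to a fixed block, and real code should retry with a larger buffer when dwNeededSize exceeds dwTotalSize.

        // Sketch only: fixed-size buffers, no retry on dwNeededSize > dwTotalSize.
        BYTE devBuf[2048] = { 0 };
        LINEDEVCAPS* devCaps = reinterpret_cast<LINEDEVCAPS*>(devBuf);
        devCaps->dwTotalSize = sizeof(devBuf);
        if (::lineGetDevCaps(hLineApp, deviceId, apiVersion, 0, devCaps) == 0)
        {
            for (DWORD id = 0; id < devCaps->dwNumAddresses; ++id)
            {
                BYTE addrBuf[2048] = { 0 };
                LINEADDRESSCAPS* addrCaps = reinterpret_cast<LINEADDRESSCAPS*>(addrBuf);
                addrCaps->dwTotalSize = sizeof(addrBuf);
                if (::lineGetAddressCaps(hLineApp, deviceId, id, apiVersion, 0, addrCaps) == 0 &&
                    addrCaps->dwAddressSize != 0)
                {
                    // The dialable address string starts dwAddressOffset bytes
                    // from the beginning of the structure.
                    const wchar_t* address =
                        reinterpret_cast<const wchar_t*>(addrBuf + addrCaps->dwAddressOffset);
                    // Compare 'address' with the number of interest; 'id' is the
                    // value lineGetAddressID() would have returned for it.
                }
            }
        }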

    Read the article

  • Read a buffer of unknown size (Console input)

    - by Sanarothe
    Hi. I'm a little behind in my X86 Asm class, and the book is making me want to shoot myself in the face. The examples in the book are insufficient and, honestly, very frustrating because of their massive dependencies upon the author's link library, which I hate. I wanted to learn ASM, not how to call his freaking library, which calls more of his library. Anyway, I'm stuck on a lab that requires console input and output. So far, I've got this for my input: input PROC INVOKE ReadConsole, inputHandle, ADDR buffer, Buf - 2, ADDR bytesRead, 0 mov eax,OFFSET buffer Ret input EndP I need to use the input and output procedures multiple times, so I'm trying to make it abstract. I'm just not sure how to use the data that is set to eax here. My initial idea was to take that string array and manually crawl through it by adding 8 to the offset for each possible digit (Input is integer, and there's a little bit of processing) but this doesn't work out because I don't know how big the input actually is. So, how would you swap the string array into an integer that could be used? Full code: (Haven't done the integer logic or the instruction string output because I'm stuck here.) include c:/irvine/irvine32.inc .data inputHandle HANDLE ? outputHandle HANDLE ? buffer BYTE BufSize DUP(?),0,0 bytesRead DWORD ? str1 BYTE "Enter an integer:",0Dh, 0Ah str2 BYTE "Enter another integer:",0Dh, 0Ah str3 BYTE "The higher of the two integers is: " int1 WORD ? int2 WORD ? int3 WORD ? Buf = 80 .code main PROC call handle push str1 call output call input push str2 call output call input push str3 call output call input main EndP larger PROC Ret larger EndP output PROC INVOKE WriteConsole Ret output EndP handle PROC USES eax INVOKE GetStdHandle, STD_INPUT_HANDLE mov inputHandle,eax INVOKE GetStdHandle, STD_INPUT_HANDLE mov outputHandle,eax Ret handle EndP input PROC INVOKE ReadConsole, inputHandle, ADDR buffer, Buf - 2, ADDR bytesRead, 0 mov eax,OFFSET buffer Ret input EndP END main
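
    The piece the lab seems to need is the classic accumulate-by-ten loop over the characters ReadConsole returned (bytesRead includes the trailing CR/LF). A hedged sketch of that algorithm follows, written in C so the steps are explicit; the names buffer and bytesRead match the assembly above, and the commented MASM fragment is only an illustration of how each iteration might translate.

        // Sketch of the digit-accumulation algorithm; 'buffer' and 'bytesRead'
        // correspond to the names used in the assembly above.
        int value = 0;
        for (DWORD i = 0; i < bytesRead; ++i)
        {
            char c = buffer[i];
            if (c < '0' || c > '9')           // stop at CR/LF or any non-digit
                break;
            value = value * 10 + (c - '0');   // value *= 10; value += digit
        }
        // In MASM terms each iteration is roughly:
        //   movzx edx, byte ptr [esi]   ; load character
        //   sub   edx, '0'              ; ASCII digit -> 0..9
        //   imul  eax, eax, 10          ; value *= 10
        //   add   eax, edx              ; value += digit
        //   inc   esi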

    Read the article

  • E_ACCESSDENIED on CoCreateInstance

    - by vucetica
    Here is a code snippet #include "stdafx.h" #include <tchar.h> #include <windows.h> #include <dshow.h> #include <ExDisp.h> int _tmain(int argc, _TCHAR* argv[]) { CoInitialize(NULL); HRESULT hr = S_OK; DWORD err = 0; // Try to create graph builder IGraphBuilder* pGraph = 0; hr = CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER, IID_IGraphBuilder, (void**)&pGraph ); err = GetLastError(); // Here, hr is E_ACCESSDENIED // err is 5 (ERROR_ACCESS_DENIED) // Try to create capture graph builder (succeeds) ICaptureGraphBuilder2* pBuild = 0; hr = CoCreateInstance(CLSID_CaptureGraphBuilder2, NULL, CLSCTX_INPROC_SERVER, IID_ICaptureGraphBuilder2, (void **)&pBuild ); err = GetLastError(); // Here, hr is S_OK // err is 0 (ERROR_SUCCESS) // Try to create IWebBrowser (succeeds) IWebBrowser2* pBrowser = 0; hr = CoCreateInstance (CLSID_InternetExplorer, NULL, CLSCTX_LOCAL_SERVER, IID_IWebBrowser2, (LPVOID *)&pBrowser); err = GetLastError(); // Here, hr is S_OK // err is 0 (ERROR_SUCCESS) return 0; } I'm trying to create IFilterGraph, which fails with E_ACCESSDENIED. On the other hand, creating other directshow objects works ok. The same with some other COM objects (tried with IWebBrowser2 as an example). Any idea what can be the problem? Thanks!

    Read the article

  • Multithreaded IOCP Client Issue

    - by Carl
    I am writing a multithreaded client that uses an IO Completion Port. I create and connect the socket that has the WSA_FLAG_OVERLAPPED attribute set. if ((m_socket = socket(AF_INET, SOCK_STREAM, IPPROTO_TCP)) == INVALID_SOCKET) { throw std::exception("Failed to create socket."); } if (WSAConnectByName(m_socket, L"server.com", L"80", &localAddressLength, reinterpret_cast<sockaddr*>(&localAddress), &remoteAddressLength, &remoteAddress, NULL, NULL) == FALSE) { throw std::exception("Failed to connect."); } I associate the IO Completion Port with the socket. if ((m_hIOCP = CreateIoCompletionPort(reinterpret_cast<HANDLE>(m_socket), m_hIOCP, NULL, 8)) == NULL) { throw std::exception("Failed to create IOCP object."); } All appears to go well until I try to send some data over the socket. SocketData* socketData = new SocketData; socketData->hEvent = 0; DWORD bytesSent = 0; if (WSASend(m_socket, socketData->SetBuffer(socketData->GenerateLoginRequestHeader()), 1, &bytesSent, NULL, reinterpret_cast<OVERLAPPED*>(socketData), NULL) == SOCKET_ERROR && WSAGetLastError() != WSA_IO_PENDING) { throw std::exception("Failed to send data."); } Instead of returning SOCKET_ERROR with the last error set to WSA_IO_PENDING, WSASend returns immediately. I need the IO to pend and for it's completion to be handled in my thread function which is also my worker thread. unsigned int __stdcall MyClass::WorkerThread(void* lpThis) { } I've done this before but I don't know what is going wrong in this case, I'd greatly appreciate any efforts in helping me fix this problem.
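
    For what it's worth, WSASend completing immediately does not by itself mean the completion is lost: on a socket associated with an IOCP, a completion packet is still queued even for operations that finish inline (unless FILE_SKIP_COMPLETION_PORT_ON_SUCCESS has been set). A minimal sketch of the worker thread draining those packets, assuming SocketData embeds the OVERLAPPED as its first member as the cast above implies:

        unsigned int __stdcall MyClass::WorkerThread(void* lpThis)
        {
            MyClass* self = static_cast<MyClass*>(lpThis);
            for (;;)
            {
                DWORD bytesTransferred = 0;
                ULONG_PTR completionKey = 0;
                OVERLAPPED* overlapped = NULL;
                BOOL ok = ::GetQueuedCompletionStatus(self->m_hIOCP, &bytesTransferred,
                                                      &completionKey, &overlapped, INFINITE);
                if (overlapped == NULL)
                    break;                        // port closed or fatal error
                SocketData* data = reinterpret_cast<SocketData*>(overlapped);
                if (!ok || bytesTransferred == 0)
                {
                    delete data;                  // failed operation or closed connection
                    continue;
                }
                // ... handle the completed send/receive described by 'data' ...
            }
            return 0;
        }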

    Read the article

  • ORGetValue from Offline Registry - ERROR_MORE_DATA

    - by user314749
    I am trying to create an offline registry in memory using the offreg.dll provided in the windows ddk 7 package. You can find out more information on the offreg.dll here: MSDN Currently, while attempting to read a value from an open registry hive / key I receive the following error: 234 or ERROR_MORE_DATA Here is the .h code that contains ORGetValue: DWORD ORAPI ORGetValue ( __in ORHKEY Handle, __in_opt PCWSTR lpSubKey, __in_opt PCWSTR lpValue, __out_opt PDWORD pdwType, __out_bcount_opt(*pcbData) PVOID pvData, __inout_opt PDWORD pcbData ); Here is the code that I am using to pull the data [DllImport("offreg.dll", CharSet = CharSet.Auto, EntryPoint = "ORGetValue", SetLastError = true, CallingConvention = CallingConvention.StdCall)] public static extern uint ORGetValue(IntPtr Handle, string lpSubKey, string lpValue, out uint pdwType, out string pvData, out uint pcbData); IntPtr myHive; IntPtr myKey; string myValue; uint pdwtype; uint pcbdata; uint ret3 = ORGetValue(myKey, "", "DefaultUserName", out pdwtype, out myValue, out pcbdata); The goal is to be able to read myValue as a string. I am not sure if I need to use marshaling... or a second call with an adjusted buffer.. Or really how to adjust the buffer in C#. Any help or pointers would be greatly appreciated. Thank you.
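
    ERROR_MORE_DATA from this family of APIs normally just means the buffer described by pvData/pcbData was too small, and the out string / out uint P/Invoke declaration effectively sends no usable buffer at all; pvData is better marshalled as a byte array (or StringBuilder) with pcbData set to its size. A hedged sketch of the usual two-call pattern against the native prototype quoted above, assuming ORGetValue follows the same convention as RegQueryValueEx (query the size first, then read):

        // Sketch of the two-call pattern; 'myKey' is an open ORHKEY. Needs <vector> and <string>.
        DWORD type = 0;
        DWORD cbData = 0;
        // First call: pvData == NULL, cbData receives the required size in bytes.
        DWORD rc = ORGetValue(myKey, NULL, L"DefaultUserName", &type, NULL, &cbData);
        if ((rc == ERROR_SUCCESS || rc == ERROR_MORE_DATA) && cbData != 0)
        {
            std::vector<BYTE> buffer(cbData);
            rc = ORGetValue(myKey, NULL, L"DefaultUserName", &type, &buffer[0], &cbData);
            if (rc == ERROR_SUCCESS && type == REG_SZ)
            {
                // REG_SZ data is a wide string; cbData counts bytes, not characters.
                std::wstring value(reinterpret_cast<const wchar_t*>(&buffer[0]));
            }
        }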

    Read the article

  • Microsoft Detours - DetourUpdateThread?

    - by pault543
    Hi, I have a few quick questions about the Microsoft Detours Library. I have used it before (successfully), but I just had a thought about this function: LONG DetourUpdateThread(HANDLE hThread); I read elsewhere that this function will actually suspend the thread until the transaction completes. This seems odd since most sample code calls: DetourUpdateThread(GetCurrentThread()); Anyway, apparently this function "enlists" threads so that, when the transaction commits (and the detours are made), their instruction pointers are modified if they lie "within the rewritten code in either the target function or the trampoline function." My questions are: When the transaction commits, is the current thread's instruction pointer going to be within the DetourTransactionCommit function? If so, why should we bother enlisting it to be updated? Also, if the enlisted threads are suspended, how can the current thread continue executing (given that most sample code calls DetourUpdateThread(GetCurrentThread());)? Finally, could you suspend all threads for the current process, avoiding race conditions (considering that threads could be getting created and destroyed at any time)? Perhaps this is done when the transaction begins? This would allow us to enumerate threads more safely (as it seems less likely that new threads could be created), although what about CreateRemoteThread()? Thanks, Paul For reference, here is an extract from the simple sample: // DllMain function attaches and detaches the TimedSleep detour to the // Sleep target function. The Sleep target function is referred to // through the TrueSleep target pointer. BOOL WINAPI DllMain(HINSTANCE hinst, DWORD dwReason, LPVOID reserved) { if (dwReason == DLL_PROCESS_ATTACH) { DetourTransactionBegin(); DetourUpdateThread(GetCurrentThread()); DetourAttach(&(PVOID&)TrueSleep, TimedSleep); DetourTransactionCommit(); } else if (dwReason == DLL_PROCESS_DETACH) { DetourTransactionBegin(); DetourUpdateThread(GetCurrentThread()); DetourDetach(&(PVOID&)TrueSleep, TimedSleep); DetourTransactionCommit(); } return TRUE; }

    Read the article

  • Does using ReadDirectoryChangesW require administrator rights?

    - by Alex Jenter
    The MSDN says that using ReadDirectoryChangesW implies the calling process having the Backup and Restore privileges. Does this mean that only a process launched under an administrator account will work correctly? I've tried the following code; it fails to enable the required privileges when running as a restricted user. void enablePrivileges() { enablePrivilege(SE_BACKUP_NAME); enablePrivilege(SE_RESTORE_NAME); } void enablePrivilege(LPCTSTR name) { HANDLE hToken; DWORD status; if (::OpenProcessToken(::GetCurrentProcess(), TOKEN_ADJUST_PRIVILEGES, &hToken)) { TOKEN_PRIVILEGES tp = { 1 }; if( ::LookupPrivilegeValue(NULL, name, &tp.Privileges[0].Luid) ) { tp.Privileges[0].Attributes = SE_PRIVILEGE_ENABLED; BOOL result = ::AdjustTokenPrivileges(hToken, FALSE, &tp, 0, NULL, NULL); verify (result != FALSE); status = ::GetLastError(); } ::CloseHandle(hToken); } } Am I doing something wrong? Is there any workaround for using ReadDirectoryChangesW from a non-administrator user account? It seems that .NET's FileSystemWatcher can do this. Thanks!
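
    In practice the Backup and Restore privileges are not required; they only let the call bypass ACL checks the way backup software does. What is needed is a directory handle opened with FILE_LIST_DIRECTORY access and FILE_FLAG_BACKUP_SEMANTICS (the flag is required to open a directory at all, despite its name, and does not itself demand the privilege). A hedged sketch that should work under a restricted account, assuming the user has ordinary read access to the directory:

        HANDLE hDir = ::CreateFileW(L"C:\\SomeFolder",
                                    FILE_LIST_DIRECTORY,
                                    FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,
                                    NULL,
                                    OPEN_EXISTING,
                                    FILE_FLAG_BACKUP_SEMANTICS,   // required to open a directory
                                    NULL);
        if (hDir != INVALID_HANDLE_VALUE)
        {
            BYTE buffer[4096];
            DWORD bytesReturned = 0;
            if (::ReadDirectoryChangesW(hDir, buffer, sizeof(buffer), TRUE,
                                        FILE_NOTIFY_CHANGE_FILE_NAME | FILE_NOTIFY_CHANGE_LAST_WRITE,
                                        &bytesReturned, NULL, NULL))
            {
                // walk the FILE_NOTIFY_INFORMATION records in 'buffer' ...
            }
            ::CloseHandle(hDir);
        }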

    Read the article

  • Unable to change the system time zone setting on Windows Server 2008 R2

    - by Ganesh
    Hi All, I have an MFC application that tries to change the system zone setting on the Windows Server 2008 R2. I am using the SetTimeZoneInformation() API which fails with the error code 1314 .i.e. “A required privilege is not held by the client.”. Please refer the sample code below: TIME_ZONE_INFORMATION l_TimeZoneInfo; DWORD l_dwRetVal = 0; ZeroMemory(&l_TimeZoneInfo, sizeof(TIME_ZONE_INFORMATION)); l_TimeZoneInfo.Bias = -330; l_TimeZoneInfo.StandardBias = 0; l_TimeZoneInfo.StandardDate.wDay = 0; l_TimeZoneInfo.StandardDate.wDayOfWeek = 0; l_TimeZoneInfo.StandardDate.wHour = 0; l_TimeZoneInfo.StandardDate.wMilliseconds = 0; l_TimeZoneInfo.StandardDate.wMinute = 0; l_TimeZoneInfo.StandardDate.wMonth = 0; l_TimeZoneInfo.StandardDate.wSecond = 0; l_TimeZoneInfo.StandardDate.wYear = 0; CString l_csDaylightName = _T("India Daylight Time"); CString l_csStdName = _T("India Standard Time"); wcscpy(l_TimeZoneInfo.DaylightName,l_csDaylightName.GetBuffer(l_csDaylightName.GetLength())); wcscpy(l_TimeZoneInfo.StandardName,l_csStdName.GetBuffer(l_csStdName.GetLength())); ::SetLastError(0); if(0 == ::SetTimeZoneInformation(&l_TimeZoneInfo)) { l_dwRetVal = ::GetLastError(); CString l_csErr = _T(""); l_csErr.Format(_T("%d"),l_dwRetVal); } The MFC application has been developed using Visual Studio 2008 and is UAC aware i.e. the application has UAC enabled in its manifest file with the UAC execution level set to “HighestAvailable”. I have administrator privileges and when I run the application it still fails to change the system zone setting. Thanks in Advance, Ganesh
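
    On Vista and later, SetTimeZoneInformation additionally requires the SE_TIME_ZONE_NAME privilege ("SeTimeZonePrivilege") to be enabled on the calling token, which is what error 1314 points at even for an elevated administrator. A hedged sketch of enabling it just before the call:

        // Sketch: enable SeTimeZonePrivilege on the current process token.
        HANDLE hToken = NULL;
        if (::OpenProcessToken(::GetCurrentProcess(), TOKEN_ADJUST_PRIVILEGES | TOKEN_QUERY, &hToken))
        {
            TOKEN_PRIVILEGES tp = { 0 };
            tp.PrivilegeCount = 1;
            tp.Privileges[0].Attributes = SE_PRIVILEGE_ENABLED;
            if (::LookupPrivilegeValue(NULL, SE_TIME_ZONE_NAME, &tp.Privileges[0].Luid))
            {
                ::AdjustTokenPrivileges(hToken, FALSE, &tp, sizeof(tp), NULL, NULL);
                // AdjustTokenPrivileges can "succeed" without enabling anything,
                // so check GetLastError() for ERROR_NOT_ALL_ASSIGNED here.
            }
            ::CloseHandle(hToken);
            // Now call SetTimeZoneInformation(&l_TimeZoneInfo) ...
        }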

    Read the article

  • Switch statement usage - C

    - by Jamie Keeling
    Hello, I have a thread function on Process B that contains a switch to perform certain operations based on the results of an event sent from Process A; these are stored as two elements in an array. I set the first element to the event which signals when Process A has data to send, and the second element to the event which indicates when Process A has closed. I have begun to implement the functionality for the switch statement but I'm not getting the results I expect. Consider the following: // //Thread function DWORD WINAPI ThreadFunc(LPVOID passedHandle) { for(i = 0; i < 2; i++) { ghEvents[i] = OpenEvent(EVENT_ALL_ACCESS, FALSE, TEXT("Global\\ProducerEvents")); if(ghEvents[i] == NULL) { getlasterror = GetLastError(); } } dwProducerEventResult = WaitForMultipleObjects( 2, ghEvents, FALSE, INFINITE); switch (dwProducerEventResult) { case WAIT_OBJECT_0 + 0: { //Producer sent data //unpackedHandle = *((HWND*)passedHandle); MessageBox(NULL,L"Test",L"Test",MB_OK); break; } case WAIT_OBJECT_0 + 1: { //Producer closed ExitProcess(1); break; } default: return; } } As you can see, if the event in the first array element is signalled Process B should display a simple message box, and if the second is signalled the application should close. When I actually close Process A, Process B displays the message box instead. If I leave the first case blank (do nothing) both applications close as they should. Furthermore, when data is sent an error is thrown (when I comment out the unpacking). Have I implemented my switch statement incorrectly? I thought I handled the unpacking of the HWND correctly too, any suggestions? Thanks for your time.
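
    One thing worth double-checking against the snippet: the loop opens the same name, "Global\\ProducerEvents", for both slots of ghEvents, so both handles refer to a single event and the wait cannot distinguish "data ready" from "producer closed". A hedged sketch of what the setup presumably intends, with a distinct name per element (the two names here are made up):

        // Sketch: open two different named events, one per array slot.
        const LPCWSTR eventNames[2] = {
            L"Global\\ProducerDataReady",   // hypothetical name for "data available"
            L"Global\\ProducerClosed"       // hypothetical name for "producer exited"
        };
        for (int i = 0; i < 2; i++)
        {
            ghEvents[i] = ::OpenEventW(SYNCHRONIZE, FALSE, eventNames[i]);
            if (ghEvents[i] == NULL)
            {
                // handle GetLastError() ...
            }
        }
        DWORD waitResult = ::WaitForMultipleObjects(2, ghEvents, FALSE, INFINITE);
        switch (waitResult)
        {
        case WAIT_OBJECT_0 + 0:   // data ready
            break;
        case WAIT_OBJECT_0 + 1:   // producer closed
            break;
        case WAIT_FAILED:
        default:
            break;
        }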

    Read the article

  • Need to call original function from detoured function

    - by peachykeen
    I'm using Detours to hook into an executable's message function, but I need to run my own code and then call the original code. From what I've seen in the Detours docs, it definitely sounds like that should happen automatically. The original function prints a message to the screen, but as soon as I attach a detour it starts running my code and stops printing. The original function code is roughly: void CGuiObject::AppendMsgToBuffer(classA, unsigned long, unsigned long, int, classB); My function is: void CGuiObject_AppendMsgToBuffer( [same params, with names] ); I know the memory position the original function resides in, so using: DWORD OrigPos = 0x0040592C; DetourAttach( (void*)OrigPos, CGuiObject_AppendMsgToBuffer); gets me into the function. This code works almost perfectly: my function is called with the proper parameters. However, execution leaves my function and the original code is not called. I've tried jmping back in, but that crashes the program (I'm assuming the code Detours moved to fit the hook is responsible for the crash). Edit: I've managed to fix the first issue, with no returning to program execution. By calling the OrigPos value as a function, I'm able to go to the "trampoline" function and from there on to the original code. However, somewhere along the lines the registers are changing and that is causing the program to crash with a segfault as soon as I get back into the original code.
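
    For reference, the way Detours normally hands the original code back is through the very pointer passed to DetourAttach: after DetourTransactionCommit it points at the trampoline, so calling through it runs the original function; no jmp back is needed. Because AppendMsgToBuffer is a member function (implicit this in ECX under MSVC's __thiscall), one common sketch hooks it with a __fastcall free function so ECX/EDX are captured instead of clobbered. The typedef below is an assumption about the target's calling convention, and classA/classB plus the 0x0040592C address come from the question:

        // Hedged sketch, not a drop-in implementation.
        typedef void (__fastcall* AppendMsgFn)(void* thisPtr, void* unusedEdx,
                                               classA a, unsigned long b,
                                               unsigned long c, int d, classB e);

        AppendMsgFn TrueAppendMsg = reinterpret_cast<AppendMsgFn>(0x0040592C);

        void __fastcall Hooked_AppendMsgToBuffer(void* thisPtr, void* unusedEdx,
                                                 classA a, unsigned long b,
                                                 unsigned long c, int d, classB e)
        {
            // ... custom logic ...
            TrueAppendMsg(thisPtr, NULL, a, b, c, d, e);   // call the original via the trampoline
        }

        void InstallHook()
        {
            DetourTransactionBegin();
            DetourUpdateThread(GetCurrentThread());
            // Detours rewrites TrueAppendMsg to point at the trampoline here.
            DetourAttach(&(PVOID&)TrueAppendMsg, Hooked_AppendMsgToBuffer);
            DetourTransactionCommit();
        }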

    Read the article

  • How to manipulate a header and then continue with it in C#?

    - by Simon Linder
    Hi all, I want to replace an old ISAPI filter that ran on IIS6. This filter checks if the request is of a special kind, then manipulates the header and continues with the request. Two headers are added in the manipulating method that I need for calling another special ISAPI module. So I have ISAPI C++ code like: DWORD OnPreProc(HTTP_FILTER_CONTEXT *pfc, HTTP_FILTER_PREPROC_HEADERS *pHeaders) { if (ManipulateHeaderInSomeWay(pfc, pHeaders)) { return SF_STATUS_REQ_NEXT_NOTIFICATION; } return SF_STATUS_REQ_FINISHED; } I now want to rewrite this ISAPI filter as a managed module for the IIS7. So I have something like this: private void OnMapRequestHandler(HttpContext context) { ManipulateHeaderInSomeWay(context); } And now what? The request seems not to do what it should? I already wrote an IIS7 native module that implements the same method. But this method has a return value with which I can tell what to do next: REQUEST_NOTIFICATION_STATUS CMyModule::OnMapRequestHandler(IN IHttpContext *pHttpContext, OUT IMapHandlerProvider *pProvider) { if (DoSomething(pHttpContext)) { return RQ_NOTIFICATION_CONTINUE; } return RQ_NOTIFICATION_FINISH_REQUEST; } So is there a way to send my manipulated context again?

    Read the article

  • ERROR_MORE_DATA ---- Reading from Registry

    - by user314749
    I am trying to create an offline registry in memory using the offreg.dll provided in the windows ddk 7 package. You can find out more information on the offreg.dll here: MSDN Currently, while attempting to read a value from an open registry hive / key I receive the following error: 234 or ERROR_MORE_DATA Here is the .h code that contains ORGetValue: DWORD ORAPI ORGetValue ( __in ORHKEY Handle, __in_opt PCWSTR lpSubKey, __in_opt PCWSTR lpValue, __out_opt PDWORD pdwType, __out_bcount_opt(*pcbData) PVOID pvData, __inout_opt PDWORD pcbData ); Here is the code that I am using to pull the data [DllImport("offreg.dll", CharSet = CharSet.Auto, EntryPoint = "ORGetValue", SetLastError = true, CallingConvention = CallingConvention.StdCall)] public static extern uint ORGetValue(IntPtr Handle, string lpSubKey, string lpValue, out uint pdwType, out string pvData, out uint pcbData); IntPtr myHive; IntPtr myKey; string myValue; uint pdwtype; uint pcbdata; uint ret3 = ORGetValue(myKey, "", "DefaultUserName", out pdwtype, out myValue, out pcbdata); The goal is to be able to read myValue as a string. I am not sure if I need to use marshaling... or a second call with an adjusted buffer.. Or really how to adjust the buffer in C#. Any help or pointers would be greatly appreciated. Thank you.

    Read the article

  • Windows 7 dsound.dll load from dll crash

    - by Jonas Byström
    I'm getting a crash when loading dsound.dll from another DLL in Windows 7. The following code crashes: #include <Windows.h> #include <mmreg.h> #include <dsound.h> #include <assert.h> HRESULT (WINAPI *pDirectSoundEnumerateA)(LPDSENUMCALLBACKA pDSEnumCallback, LPVOID pContext); HMODULE hDsound; BOOL CALLBACK DSEnum(LPGUID a, LPCSTR b, LPCSTR c, LPVOID d) { return TRUE; } void CrashTest() { HRESULT hr; hDsound = LoadLibraryA("dsound.dll"); assert(hDsound); *(void**)&pDirectSoundEnumerateA = (void*)GetProcAddress(hDsound, "DirectSoundEnumerateA"); assert(pDirectSoundEnumerateA); hr = pDirectSoundEnumerateA(DSEnum, NULL); assert(!FAILED(hr)); } BOOL APIENTRY DllMain(HANDLE hModule,DWORD ul_reason_for_call,LPVOID lpReserved) { if (ul_reason_for_call == DLL_PROCESS_ATTACH) { DisableThreadLibraryCalls(hModule); CrashTest(); } } with this error code: Unhandled exception at ... in ...: 0xC0000005: Access violation reading location 0x00000044. (it's always 0x44 for some reason). It works on Windows XP or when loading directly from the .exe (not from a separate DLL). Help!?! :)
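
    One common explanation for this pattern is the loader-lock restriction: DLL_PROCESS_ATTACH runs while the loader lock is held, and calling LoadLibrary (or anything that does real work on load, which dsound.dll does) from DllMain is explicitly unsupported, so it can appear to work on one OS and crash on another. A hedged restructuring sketch that defers the work until after the DLL is loaded, leaving DllMain empty; "InitAudio" is a made-up export name:

        // Sketch: export an explicit init function instead of working in DllMain.
        BOOL APIENTRY DllMain(HANDLE hModule, DWORD ul_reason_for_call, LPVOID lpReserved)
        {
            if (ul_reason_for_call == DLL_PROCESS_ATTACH)
                DisableThreadLibraryCalls((HMODULE)hModule);
            return TRUE;            // no LoadLibrary / DirectSound calls here
        }

        // The host calls this after its LoadLibrary("mydll.dll") has returned,
        // i.e. outside the loader lock.
        extern "C" __declspec(dllexport) void InitAudio()
        {
            CrashTest();            // safe place to LoadLibrary("dsound.dll")
        }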

    Read the article

  • Switch/case without break inside DllMain

    - by Sherwood Hu
    I have a DllMain that allocates thread-local storage when a thread attaches to this DLL. Code is below: BOOL APIENTRY DllMain(HMODULE hModule, DWORD ul_reason_for_call, LPVOID lpReserved) { LPVOID lpvData; BOOL fIgnore; switch (ul_reason_for_call) { case DLL_PROCESS_ATTACH: onProcessAttachDLL(); // Allocate a TLS index. if ((dwTlsIndex = TlsAlloc()) == TLS_OUT_OF_INDEXES) return FALSE; // how can it jump to next case??? case DLL_THREAD_ATTACH: // Initialize the TLS index for this thread. lpvData = (LPVOID) LocalAlloc(LPTR, MAX_BUFFER_SIZE); if (lpvData != NULL) fIgnore = TlsSetValue(dwTlsIndex, lpvData); break; ... } I know that for the main thread the DLL_THREAD_ATTACH case is not entered, per the Microsoft documentation. However, the above code works. I am using VC2005. When I stepped through in the debugger, I saw that it entered the DLL_THREAD_ATTACH case when ul_reason_for_call = 1! How can that happen? If I add `break` at the end of the DLL_PROCESS_ATTACH block, the DLL fails to work. How can this happen?
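
    Nothing is jumping anywhere unexpected: C's switch falls through when a case has no break, and the MSDN TLS sample relies on that deliberately so that DLL_PROCESS_ATTACH (reason 1) also runs the per-thread allocation for the thread that loaded the DLL, which never receives DLL_THREAD_ATTACH. That also explains why adding a break makes the DLL stop working: the first thread then never gets a TLS block. A hedged sketch of the same structure with the fall-through spelled out:

        switch (ul_reason_for_call)
        {
        case DLL_PROCESS_ATTACH:
            onProcessAttachDLL();
            if ((dwTlsIndex = TlsAlloc()) == TLS_OUT_OF_INDEXES)
                return FALSE;
            // deliberate fall-through: the loading thread gets its TLS block too,
            // because DLL_THREAD_ATTACH is never sent for it
        case DLL_THREAD_ATTACH:
            lpvData = (LPVOID)LocalAlloc(LPTR, MAX_BUFFER_SIZE);
            if (lpvData != NULL)
                fIgnore = TlsSetValue(dwTlsIndex, lpvData);
            break;
        // ... DLL_THREAD_DETACH / DLL_PROCESS_DETACH would free the block ...
        }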

    Read the article

  • iPod library song path access

    - by Narendra Kumar
    I have studied a lot but did not find a good answer. My problem is that I am calculating the beats per minute of a song. I used the BASS API for that; the problem is that I can get the BPM of a file in my resource folder, but I need to get the BPM of all the songs in the iPod library. I am getting the path of the song from the MPMediaItemPropertyAssetURL property of MPMediaItem, but when I pass it to the API, BASS_StreamCreateFile() says the stream can't load. In my view I am not getting the right path to the song. How can we access a valid path? Has anyone accessed an iPod library song with an external API? Please help me. Thanks. The code is this: NSURL *assetURL = [song valueForProperty:MPMediaItemPropertyAssetURL]; NSString *respath = [NSString stringWithFormat:@"%@",[assetURL absoluteString]]; BASS_SetConfig(BASS_CONFIG_IOS_MIXAUDIO, 0); // Disable mixing. To be called before BASS_Init. if (HIWORD(BASS_GetVersion()) != BASSVERSION) { NSLog(@"An incorrect version of BASS was loaded"); } // Initialize default device. if (!BASS_Init(-1, 44100, 0, NULL, NULL)) { //textView.text = [NSString stringWithFormat:@"%@ CAN'T Load Stream",textView.text]; } DWORD chan1; if(!(chan1=BASS_StreamCreateFile(FALSE, [respath UTF8String], 0, 0, BASS_SAMPLE_LOOP))) { NSLog(@"Can't load stream!"); textView.text = [NSString stringWithFormat:@"%@ not loading...",textView.text]; } mainStream=BASS_StreamCreateFile(FALSE, [respath cStringUsingEncoding:NSUTF8StringEncoding], 0, 0, BASS_SAMPLE_FLOAT|BASS_STREAM_PRESCAN|BASS_STREAM_DECODE); float playBackDuration=BASS_ChannelBytes2Seconds(mainStream, BASS_ChannelGetLength(mainStream, BASS_POS_BYTE)); NSLog(@"Play back duration is %f",playBackDuration); HSTREAM bpmStream=BASS_StreamCreateFile(FALSE, [respath UTF8String], 0, 0, BASS_STREAM_PRESCAN|BASS_SAMPLE_FLOAT|BASS_STREAM_DECODE); //BASS_ChannelPlay(bpmStream,FALSE); BpmValue= BASS_FX_BPM_DecodeGet(bpmStream,0.0, playBackDuration, MAKELONG(45,256), BASS_FX_BPM_MULT2| BASS_FX_BPM_MULT2 | BASS_FX_FREESOURCE, (BPMPROCESSPROC*)proc); textView.text = [NSString stringWithFormat:@"%@ %f",textView.text,BpmValue];

    Read the article

  • Mixing c++ standard strings and windows API

    - by JB
    Many windows APIs take a pointer to a buffer and a size element but the result needs to go into a c++ string. (I'm using windows unicode here so they are wstrings) Here is an example :- #include <iostream> #include <string> #include <vector> #include <windows.h> using namespace std; // This is the method I'm interested in improving ... wstring getComputerName() { vector<wchar_t> buffer; buffer.resize(MAX_COMPUTERNAME_LENGTH+1); DWORD size = MAX_COMPUTERNAME_LENGTH; GetComputerNameW(&buffer[0], &size); return wstring(&buffer[0], size); } int main() { wcout << getComputerName() << "\n"; } My question really is, is this the best way to write the getComputerName function so that it fits into C++ better, or is there a better way? I don't see any way to use a string directly without going via a vector unless I missed something? It works fine, but somehow seems a little ugly. The question isn't about that particular API, it's just a convenient example.
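
    One slightly tighter variant, assuming a reasonably modern compiler where std::wstring storage is contiguous (guaranteed since C++11 and true in practice for MSVC's implementation before that): size the string up front, let the API write into it, then trim to the length it reports, skipping the vector entirely.

        wstring getComputerName()
        {
            wstring name(MAX_COMPUTERNAME_LENGTH + 1, L'\0');
            DWORD size = static_cast<DWORD>(name.size());
            if (!::GetComputerNameW(&name[0], &size))
                return wstring();        // or throw, depending on taste
            name.resize(size);           // 'size' now holds the character count written
            return name;
        }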

    Read the article

  • Dev-C++ and Detours compiling error

    - by Julio
    Hello. As title says I'm trying to compile with Dev-C++ a simple DLL using Detours, but I get this error: syntax error before token '&' on this lines: DetourAttach(&(PVOID &)trueMessageBox, hookedMessageBox) DetourDetach(&(PVOID &)trueMessageBox, hookedMessageBox) The complete code is #include <windows.h> #include <detours.h> #pragma comment( lib, "Ws2_32.lib" ) #pragma comment( lib, "detours.lib" ) #pragma comment( lib, "detoured.lib" ) int (WINAPI * trueMessageBox)(HWND hWnd, LPCSTR lpText, LPCSTR lpCaption, UINT uType) = MessageBox; int WINAPI hookedMessageBox(HWND hWnd, LPCSTR lpText, LPCSTR lpCaption, UINT uType) { LPCSTR lpNewCaption = "You've been hijacked"; int iReturn = trueMessageBox(hWnd, lpText, lpNewCaption, uType); return iReturn; } BOOL WINAPI DllMain( HINSTANCE, DWORD dwReason, LPVOID ) { switch ( dwReason ) { case DLL_PROCESS_ATTACH: DetourTransactionBegin(); DetourUpdateThread( GetCurrentThread() ); DetourAttach(&(PVOID &)trueMessageBox, hookedMessageBox) DetourTransactionCommit(); break; case DLL_PROCESS_DETACH: DetourTransactionBegin(); DetourUpdateThread( GetCurrentThread() ); DetourDetach(&(PVOID &)trueMessageBox, hookedMessageBox) DetourTransactionCommit(); break; } return TRUE; }

    Read the article

  • Using an SHA1 with Microsoft CAPI

    - by Erik Jõgi
    I have an SHA1 hash and I need to sign it. The CryptSignHash() method requires a HCRYPTHASH handle for signing. I create it and as I have the actual hash value already then set it: CryptCreateHash(cryptoProvider, CALG_SHA1, 0, 0, &hash); CryptSetHashParam(hash, HP_HASHVAL, hashBytes, 0); The hashBytes is an array of 20 bytes. However the problem is that the signature produced from this HCRYPTHASH handle is incorrect. I traced the problem down to the fact that CAPI actually doesn't use all 20 bytes from my hashBytes array. For some reason it thinks that SHA1 is only 4 bytes. To verify this I wrote this small program: HCRYPTPROV cryptoProvider; CryptAcquireContext(&cryptoProvider, NULL, NULL, PROV_RSA_FULL, 0); HCRYPTHASH hash; HCRYPTKEY keyForHash; CryptCreateHash(cryptoProvider, CALG_SHA1, keyForHash, 0, &hash); DWORD hashLength; CryptGetHashParam(hash, HP_HASHSIZE, NULL, &hashLength, 0); printf("hashLength: %d\n", hashLength); And this prints out hashLength: 4 ! Can anyone explain what I am doing wrong or why Microsoft CAPI thinks that SHA1 is 4 bytes (32 bits) instead of 20 bytes (160 bits).
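
    One thing to note about the probe program: pdwDataLen is an in/out parameter, and because pbData is NULL the call reports that the answer needs a 4-byte buffer (the size is returned as a DWORD), not that the hash is 4 bytes. Queried as sketched below it should print 20 for CALG_SHA1, which would suggest the signing problem lies elsewhere (for example the uninitialised keyForHash handle, or the key spec passed to CryptSignHash). A hedged sketch:

        HCRYPTPROV prov = 0;
        ::CryptAcquireContext(&prov, NULL, NULL, PROV_RSA_FULL, CRYPT_VERIFYCONTEXT);

        HCRYPTHASH hash = 0;
        ::CryptCreateHash(prov, CALG_SHA1, 0, 0, &hash);   // hKey must be 0 for a non-keyed hash

        DWORD hashSize = 0;
        DWORD len = sizeof(hashSize);                      // in/out: size of the output buffer
        ::CryptGetHashParam(hash, HP_HASHSIZE, (BYTE*)&hashSize, &len, 0);
        printf("hash size: %lu bytes\n", hashSize);        // expected: 20 for SHA-1

        ::CryptDestroyHash(hash);
        ::CryptReleaseContext(prov, 0);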

    Read the article

  • Registry entry using VC++, Data corrupting

    - by sijith
    Hi, I want to write the system time to the registry, and I did it like this, but only some null characters end up there. When I use LPCTSTR data = TEXT("24/3/2010\0"); LONG setRes = RegSetValueEx (hkey, value, 0, REG_SZ, (LPBYTE)data, 100)); the value is added to the registry successfully. How can I trace the issue? If possible please check my code: include include include void Regkey::create_Registry() { HKEY hkey; DWORD dwDisposition,lpData; SYSTEMTIME time; GetLocalTime( &time ); int hour = time.wHour; if (hour > 12) hour -= 12; char szData[20]; sprintf (szData, "%02d/%02d/%04d", time.wDay, time.wMonth, time.wYear); if(RegCreateKeyEx(HKEY_LOCAL_MACHINE, TEXT("Software\\Sijith\\Test"), 0, NULL, 0, 0, NULL, &hkey, &dwDisposition)== ERROR_SUCCESS) { LPCTSTR sk = TEXT("Software\\Sijith\\Test"); LONG openRes = RegOpenKeyEx(HKEY_LOCAL_MACHINE, sk, 0, KEY_ALL_ACCESS , &hkey); LPCTSTR value = TEXT("CheckSoftwareKey"); LONG setRes = RegSetValueEx (hkey, value, 0, REG_SZ, (CONST BYTE *)szData, sizeof(TCHAR) * (_tcslen(szData) + 1)); RegCloseKey(hkey); } } Output: value name: CheckSoftwareKey, valueData: ?????
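
    A likely culprit, offered as a hedged guess: the date is formatted into a char (ANSI) buffer with sprintf, but in a Unicode build RegSetValueEx resolves to RegSetValueExW and interprets the data as wide characters, which is consistent with the ????? value. A sketch that keeps everything in TCHARs and passes the byte count for cbData:

        SYSTEMTIME time;
        ::GetLocalTime(&time);

        TCHAR szData[20] = { 0 };
        _stprintf_s(szData, 20, _T("%02d/%02d/%04d"), time.wDay, time.wMonth, time.wYear);

        HKEY hKey = NULL;
        if (::RegCreateKeyEx(HKEY_LOCAL_MACHINE, _T("Software\\Sijith\\Test"), 0, NULL, 0,
                             KEY_SET_VALUE, NULL, &hKey, NULL) == ERROR_SUCCESS)
        {
            // cbData is in bytes and includes the terminating null.
            ::RegSetValueEx(hKey, _T("CheckSoftwareKey"), 0, REG_SZ,
                            (const BYTE*)szData,
                            (DWORD)((_tcslen(szData) + 1) * sizeof(TCHAR)));
            ::RegCloseKey(hKey);
        }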

    Read the article

  • Using an SHA1 with Microsoft CAPI

    - by Erik Jõgi
    Hello, I have an SHA1 hash and I need to sign it. The CryptSignHash() method requires a HCRYPTHASH handle for signing. I create it and as I have the actual hash value already then set it: CryptCreateHash(cryptoProvider, CALG_SHA1, 0, 0, &hash); CryptSetHashParam(hash, HP_HASHVAL, hashBytes, 0); The hashBytes is an array of 20 bytes. However the problem is that the signature produced from this HCRYPTHASH handle is incorrect. I traced the problem down to the fact that CAPI actually doesn't use all 20 bytes from my hashBytes array. For some reason it thinks that SHA1 is only 4 bytes. To verify this I wrote this small program: HCRYPTPROV cryptoProvider; CryptAcquireContext(&cryptoProvider, NULL, NULL, PROV_RSA_FULL, 0); HCRYPTHASH hash; HCRYPTKEY keyForHash; CryptCreateHash(cryptoProvider, CALG_SHA1, keyForHash, 0, &hash); DWORD hashLength; CryptGetHashParam(hash, HP_HASHSIZE, NULL, &hashLength, 0); printf("hashLength: %d\n", hashLength); And this prints out hashLength: 4 ! Can anyone explain what I am doing wrong or why Microsoft CAPI thinks that SHA1 is 4 bytes (32 bits) instead of 20 bytes (160 bits). Thank you.

    Read the article

  • Why is an exception thrown on memcpy when copying LPBYTE to LPTSTR (clipboard)?

    - by user46503
    Hello, I have a LPBYTE array (taken from file) and I need to copy it into LPTSRT (actually into the clipboard). The trouble is copying work but unstable, sometime an exception was thrown (not always) and I don't understand why. The code is: FILE *fConnect = _wfopen(connectFilePath, _T("rb")); if (!fConnect) return; fseek(fConnect, 0, SEEK_END); lSize = ftell(fConnect); rewind(fConnect); LPBYTE lpByte = (LPBYTE) malloc(lSize); fread(lpByte, 1, lSize, fConnect); lpByte[lSize] = 0; fclose(fConnect); //Copy into clipboard BOOL openRes = OpenClipboard(NULL); if (!openRes) return; DWORD err = GetLastError(); EmptyClipboard(); HGLOBAL hText; hText = GlobalAlloc(GMEM_MOVEABLE, (lSize+ sizeof(TCHAR))); LPTSTR sMem = (TCHAR*)GlobalLock(hText); memcpy(sMem, lpByte, (lSize + sizeof(TCHAR))); The last string is the place where the exception is thrown. Thanks a lot

    Read the article

  • CreateThread() fails on 64 bit Windows, works on 32 bit Windows. Why?

    - by Stephen Kellett
    Operating System: Windows XP 64 bit, SP2. I have an unusual problem. I am porting some code from 32 bit to 64 bit. The 32 bit code works just fine. But when I call CreateThread() for the 64 bit version the call fails. I have three places where this fails. 2 call CreateThread(). 1 calls beginthreadex() which calls CreateThread(). All three calls fail with error code 0x3E6, "Invalid access to memory location". The problem is all the input parameters are correct. HANDLE h; DWORD threadID; h = CreateThread(0, // default security 0, // default stack size myThreadFunc, // valid function to call myParam, // my param 0, // no flags, start thread immediately &threadID); All three calls to CreateThread() are made from a DLL I've injected into the target program at the start of the program execution (this is before the program has got to the start of main()/WinMain()). If I call CreateThread() from the target program (same params) via say a menu, it works. Same parameters etc. Bizarre. If I pass NULL instead of &threadID, it still fails. If I pass NULL as myParam, it still fails. I'm not calling CreateThread from inside DllMain(), so that isn't the problem. I'm confused and searching on Google etc hasn't shown any relevant answers. If anyone has seen this before or has any ideas, please let me know. Thanks for reading.

    Read the article

  • error C2146: syntax error : missing ';' before identifier 'g_App'

    - by numerical25
    I wish C++ was a little more specific with the messages it gives. The following error is thrown in the file below. main.h #ifndef main_h #define main_h //includes #include <windows.h> #include <commctrl.h> #include <d3d9.h> #include <fstream> #include "capplication.h" //constants #define TITLE "D3D Tut 01: Create Window" #define WINDOW_X 350 #define WINDOW_Y 320 //Button ID's #define ID_START 1 #define ID_CANCEL 2 //globals extern CApplication g_App; //function prototypes LRESULT CALLBACK WindowProcedure(HWND,UINT,WPARAM,LPARAM); #endif The only header file that could possibly throw this error is capplication.h, given below. capplication.h #ifndef capplication_h #define capplication_h #include"main.h" class CApplication { public: CApplication(void); ~CApplication(void); void InitWindow(void); void SaveSettings(void); void LoadSettings(void); void KillWindow(void); inline bool GetWindowStatus(void) { return m_bRunningWindow; } inline HWND GetWindowHandle(void) { return m_hWindow; } inline void SetWindowStatus(bool bRunningWindow) { m_bRunningWindow = bRunningWindow; } private: bool m_bRunningWindow; HWND m_hWindow, m_hBtnStart, m_hBtnCancel, m_hLblResolution, m_hCbResolution, m_hLblBackBuffer, m_hCbBackBuffer, m_hLblDepthStencil, m_hCbDepthStencil, m_hLblVertexProcessing, m_hCbVertexProcessing, m_hLblMultiSampling, m_hCbMultiSampling, m_hLblAnisotropy, m_hCbAnisotropy; DWORD m_dwWidth, m_dwHeight, m_dwVertexProcessing, m_dwAnisotropy; D3DFORMAT m_ColorFormat, m_DepthStencilFormat; D3DMULTISAMPLE_TYPE m_MultiSampling; }; #endif Besides that, the only suspicious thing I see is the fstream include in the first file. I did have it as fstream.h, but VC++ was not recognizing it, so I was told to remove the .h and I did. Now I am down to this error and I have no clue what it could be. Possibly something obvious.
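
    One hedged observation: main.h includes capplication.h, which includes main.h again; with the include guards, whichever header a .cpp pulls in first ends up seeing "extern CApplication g_App;" before class CApplication has been declared, and MSVC reports exactly that as C2146 (missing ';' before identifier). The usual fix is a forward declaration in place of the circular include, for example:

        // main.h -- sketch: forward-declare instead of including capplication.h
        #ifndef main_h
        #define main_h

        #include <windows.h>
        // ... other includes ...

        class CApplication;              // forward declaration is enough for extern / pointers

        extern CApplication g_App;       // any .cpp that actually uses g_App's members
                                         // includes capplication.h itself
        #endif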

    Read the article

  • RegQueryValueEx not working with a Release version but working fine with Debug

    - by Nux
    Hi. I'm trying to read some ODBC details from the registry, and for that I use RegQueryValueEx. The problem is that when I compile the Release version it simply cannot read any registry values. The code is: CString odbcFuns::getOpenedKeyRegValue(HKEY hKey, CString valName) { CString retStr; char *strTmp = (char*)malloc(MAX_DSN_STR_LENGTH * sizeof(char)); memset(strTmp, 0, MAX_DSN_STR_LENGTH); DWORD cbData; long rret = RegQueryValueEx(hKey, valName, NULL, NULL, (LPBYTE)strTmp, &cbData); if (rret != ERROR_SUCCESS) { free(strTmp); return CString("?"); } strTmp[cbData] = '\0'; retStr.Format(_T("%s"), strTmp); free(strTmp); return retStr; } I've found a workaround for this - I disabled optimization (/Od) - but it seems strange that I needed to do that. Is there some other way? I am using Visual Studio 2005. Maybe it's a bug in VS? Almost forgot: the error code is 2 (as if the key couldn't be found).
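
    One hedged thing to check before blaming the optimizer: cbData is passed to RegQueryValueEx uninitialised. It is an in/out parameter that must hold the buffer size on entry; in a Debug build the stack garbage happens to be large enough, in a Release build it often is not, which produces exactly this works-in-Debug-only behaviour (and error 2 when the API gives up). A sketch of the same routine with the size supplied up front:

        // Sketch: same routine, with cbData initialised to the buffer size in bytes.
        CString odbcFuns::getOpenedKeyRegValue(HKEY hKey, CString valName)
        {
            char strTmp[MAX_DSN_STR_LENGTH] = { 0 };
            DWORD cbData = MAX_DSN_STR_LENGTH - 1;   // in: size of strTmp; out: bytes stored
            LONG rret = ::RegQueryValueEx(hKey, valName, NULL, NULL, (LPBYTE)strTmp, &cbData);
            if (rret != ERROR_SUCCESS)
                return CString("?");
            strTmp[cbData] = '\0';                   // REG_SZ data is not guaranteed to be terminated
            CString retStr;
            retStr.Format(_T("%s"), strTmp);
            return retStr;
        }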

    Read the article

  • Ping broadcast on Win XP SP3

    - by PaulH
    I'm trying to ping the broadcast address 255.255.255.255 on WinXP SP3. If I use the command line, I get host error: C:\>ping 255.255.255.255 Ping request could not find host 255.255.255.255. Please check the name and try again. If I try a C++ program using the iphlpapi, IcmpSendEcho() fails and GetLastError returns 11010 IP_REQ_TIMED_OUT. HANDLE h = ::IcmpCreateFile(); IPAddr broadcast = inet_addr( "255.255.255.255" ); BYTE payload[ 32 ] = { 0 }; IP_OPTION_INFORMATION option = { 255, 0, 0, 0, 0 }; // a buffer with room for 32 replies each containing the full payload std::vector< BYTE > replies( 32 * ( sizeof( ICMP_ECHO_REPLY ) + 32 ) ); DWORD res = ::IcmpSendEcho( h, broadcast, payload, sizeof( payload ), &option, &replies[ 0 ], replies.size(), 1000 ); ::IcmpCloseHandle( h ); I can ping the local broadcast 192.168.0.255 with no problem. What do I need to do to ping the global broadcast? Thanks, PaulH

    Read the article
