Use the content inside the codeblock to see how Texmorph reacts to them and resolves issues.
# Texmorph English Test
# Test 1: Citation Tag Pollution
According to recent research[cite:1], deep learning has achieved breakthrough progress in image recognition[cite:2][cite:3]. Particularly in medical imaging analysis[cite:45], CNN models have surpassed human expert accuracy[cite:source_2024]. The application prospects of this technology are very promising[cite:ref_a][cite:ref_b].
Additionally, reinforcement learning[^1] has demonstrated remarkable capabilities in game AI[^2]. The success of AlphaGo[^3] proves this point.
# Test 2: LaTeX Formulas Incorrectly Bolded
In quantum mechanics, the Schrödinger equation is one of the most fundamental equations:
**$$i\hbar\frac{\partial}{\partial t}\Psi(\mathbf{r},t) = \hat{H}\Psi(\mathbf{r},t)$$**
Where **$\hbar$** is the reduced Planck constant and **$\Psi$** is the wave function.
The energy eigenvalue equation can be written as **$\hat{H}\psi = E\psi$**, where **$E$** represents the energy eigenvalue.
Einstein's mass-energy equation **$$E = mc^2$$** reveals the equivalence of mass and energy.
# Test 3: LaTeX Formulas Incorrectly Italicized
Euler's formula is called the most beautiful mathematical formula:
*$$e^{i\pi} + 1 = 0$$*
It connects five of the most important mathematical constants: *$e$*, *$i$*, *$\pi$*, *$1$*, and *$0$*.
Bayes' theorem: *$$P(A|B) = \frac{P(B|A)P(A)}{P(B)}$$*
# Test 4: Mixed Pollution (Citations + Format Errors)
According to the paper[cite:arxiv_2024_001], the self-attention mechanism in Transformer models can be expressed as:
**$$\text{Attention}(Q, K, V) = \text{softmax}\left(\frac{QK^T}{\sqrt{d_k}}\right)V$$**
Where **$Q$**, **$K$**, **$V$** represent the query, key, and value matrices respectively[cite:vaswani2017]. This formula[^4] is the cornerstone of modern NLP[cite:bert][cite:gpt].
The model's loss function typically uses cross-entropy: *$$\mathcal{L} = -\sum_{i} y_i \log(\hat{y}_i)$$*[cite:loss_func]
# Test 5: Markdown Format Removal Test
### This is a Level 3 Heading
This paragraph contains **bold**, *italic*, ***bold italic***, ~~strikethrough~~, and inline code.
* This is unordered list item 1
* This is unordered list item 2
* Nested list item
* This is unordered list item 3
1 Ordered list item 1
2 Ordered list item 2
3 Ordered list item 3
⠀This is a block quote It can have multiple lines
Here is a ~[link text](https://example.com/)~ and an image:

# Test 6: Extreme Mixed Case
**According to research[cite:extreme_test], the following formula is very important:**
**$$\int_{-\infty}^{\infty} e^{-x^2} dx = \sqrt{\pi}$$**[cite:gaussian]
This *Gaussian integral*[^5] result **$\sqrt{\pi}$** has wide applications in probability theory[cite:prob].
1 First, we have **$f(x) = e^{-x^2}$**[cite:step1]
2 Then, calculate *$$I^2 = \int\int e^{-(x^2+y^2)} dx,dy$$*[cite:step2]
3 Finally we get **$$I = \sqrt{\pi}$$**[cite:step3]
⠀
# Test 7: Pure English Content + Citations
The history of artificial intelligence[cite:ai_history] can be traced back to the 1950s[cite:turing]. Turing proposed the famous "Turing Test"[cite:turing_test] to determine whether a machine possesses intelligence[^6].
In recent years, large language models[cite:llm] such as GPT[cite:gpt4] and Claude[cite:anthropic] have demonstrated remarkable capabilities[cite:emergent]. The parameter count of these models has reached the **$10^{11}$** scale[cite:scaling_laws].
# Test 8: Code Block + Formula Mixed
Implementing the softmax function in Python, corresponding to the formula **$$\sigma(z)*****i = \frac{e^{z_i}}{\sum*****{j=1}^{K} e^{z_j}}$$**[cite:softmax]:
### import numpy as np
### def softmax(x):
### exp_x = np.exp(x - np.max(x))
### return exp_x / exp_x.sum()
Where **$z_i$** is the *$i$*-th element of the input vector[cite:impl_note].